B. Statistical Methods
1. Universe and Respondent Selection
The Annual Surveys of Probation and Parole (ASPP) are designed to collect probation and parole data from community-supervising jurisdictions within each state. The universe includes state, federal, and locally administered probation and parole departments. Information is collected from central reporters within each state wherever possible so as to minimize the burden on individual agencies. For probation, there are 468 respondents. These include 33 central state reporters; 433 separate city, county, or court reporters, including the state agency in Pennsylvania, which also reports data for 65 counties; the District of Columbia; and the federal system. For parole, there are 53 respondents, including 49 central state reporters; the state parole agency in Pennsylvania, which also provides data for 65 counties; the District of Columbia; and the federal system.
BJS and its data collection agent for the ASPP, Westat, have used various methods to ensure the accuracy and the completeness of the population frame for each survey:
Agency staff provide information about newly formed, merged, and closed supervising agencies during the data collection process. This information is used to update the frame prior to the start of each data collection year. Such methods have helped, for example, to alert BJS to changes in state parole supervision authority, such as those resulting from the Public Safety Realignment in California in 2011. Under the realignment, the California Department of Corrections and Rehabilitation, Department of Juvenile Justice no longer supervises parolees; therefore, the agency has been removed from the parole frame.
Close attention is paid to unexplained change in the total population that occurs from the end of one year to the beginning of the next, as well as to growth or decline in the total population during the current reporting year. During survey administration, a comparison is made between the previous yearend population and the reported beginning-year population; if there is a difference of 10 percent or more, respondents are prompted to review their data and to enter a reason for the discrepancy. If the growth or decline in the total population during the current reporting year (from January 1 to December 31) exceeds 10 percent, respondents are prompted to enter information about the reason for the change.
Following data submission, all data are reviewed by Westat. In any given year, a parole agency, or a probation agency with a population of 100 or more, is flagged for review and potential follow-up if its previous yearend population differs by more than 5 percent from its reported beginning-year population; a probation agency with a population of less than 100 is flagged if the difference exceeds 10 percent. Westat also reviews the information provided by agencies when growth from January 1 to December 31 of the current reporting year exceeds 10 percent.
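As an illustration only, and not Westat's actual review code, the flagging rules just described could be expressed as follows; the field names and the dictionary-style record are hypothetical:

    # Illustrative sketch of the post-submission review flags; all field
    # names are hypothetical.
    def review_flags(agency):
        """Return a list of reasons a submission should be reviewed."""
        flags = []
        prev = agency["prev_yearend"]   # yearend population from last year's survey
        begin = agency["begin_year"]    # reported January 1 population
        end = agency["yearend"]         # reported December 31 population

        # 5 percent threshold for parole agencies and for probation agencies
        # with populations of 100 or more; 10 percent for smaller probation
        # agencies.
        small = agency["survey"] == "probation" and prev < 100
        threshold = 0.10 if small else 0.05
        if prev > 0 and abs(begin - prev) / prev > threshold:
            flags.append("beginning-year population differs from previous yearend")

        # Growth or decline of more than 10 percent within the reporting year
        # also prompts a request for an explanation.
        if begin > 0 and abs(end - begin) / begin > 0.10:
            flags.append("within-year change exceeds 10 percent")
        return flags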
During follow-up, Westat uses open-ended probes to determine the reasons for differences between the yearend and beginning-of-year populations, and follows up further when the reason for population growth during the current reporting year is unclear. Differences may be explained by the correction of a data entry error; a change in reporting methods; a change in the agency's responsibility (e.g., the agency has taken responsibility for probationers or parolees previously supervised by another agency, another agency has assumed responsibility for a portion of the population the respondent's agency previously supervised, or a law has changed); or, in the case of within-year change, by genuine growth or decline of the population rather than a methodological change in reporting.
Starting with the 2014 collection year, respondents to the Annual Probation Survey will be asked to respond to three additional questions to ensure that information is being reported for all probation agencies that fall within the scope of the collection (see Attachment 8, proposed 2014 Annual Probation Survey, Form CJ-8, items 17, 18, and 19).1 The information from the 2014 survey is expected to enable BJS to determine whether any probation agency that should have been included has been erroneously excluded. In subsequent data collection years, the information from these three items will be used to detect whether there has been any change in the agencies for which information is being reported.
For item 17, respondents will be asked to specify the probation agencies for which they have provided information. To facilitate the process, item 17 will include a preliminary list of all independent probation agencies known to supervise adult probationers in each state. Lists of probation agencies, by state, have been developed in preparation for the 2014 Census of Adult Probation Supervising Agencies (CAPSA).2 If the respondent has included information for a probation agency that is not on the list, the respondent will be asked to provide the agency's name and location in item 18.
To ensure that respondents are considering the entire universe of adults on probation in their state, item 19 will ask the respondent to mark a checklist to indicate the level(s) of court responsible for placing adults on probation in the agencies for which they report. A state-specific checklist of all levels of state courts responsible for criminal proceedings which might result in adults being placed on probation will be provided, based on charts published by the Court Statistics Project of the National Center for State Courts (http://www.courtstatistics.org/Other-Pages/State_Court_Structure_Charts.aspx). These charts were last updated in 2010, but have been very stable over time.
Following data collection, BJS and Westat will use this information to verify that each level of court responsible for placing adults on probation supervision in each state has been marked. The information from the court checklist will be compared with the information from items 17 and 18; with information from existing sources (e.g., state websites); and with the final roster of agencies defined through the CAPSA collection to locate any additional agencies that fall within the scope of the Annual Probation Survey but are not being counted. Items 17, 18, and 19, together with these other sources, will allow BJS to develop a comprehensive list of adult probation agencies that should be included in the Annual Probation Survey with the least amount of reporting burden. (See part B, item 4, "Testing of Procedures," for information on the reporting burden associated with these three items.)
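A minimal sketch of the core comparison, assuming the CAPSA roster and the survey responses are available as simple collections of agency names; the function and variable names are hypothetical, and the actual review also draws on state websites and the item 19 court checklist:

    # Compare the agencies reported in items 17 and 18 against the CAPSA
    # roster for a state to surface agencies that may fall within scope but
    # are not being counted. All names below are hypothetical.
    def uncounted_agencies(capsa_roster, item17_reported, item18_additions):
        reported = set(item17_reported) | set(item18_additions)
        return set(capsa_roster) - reported

    missing = uncounted_agencies(
        capsa_roster={"County A Probation", "County B Probation", "County C Court Services"},
        item17_reported={"County A Probation", "County B Probation"},
        item18_additions=set(),
    )
    print(missing)  # {'County C Court Services'} would be checked against other sources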
Following the review of the probation agencies included in the Annual Probation Survey, the court checklist is expected to enable BJS to more clearly define the universe in terms of the courts that are responsible for placing adults on probation supervision. In addition to enabling individual states to better understand what is included when they compare their counts with those of other states, this improved transparency is expected to result in greater confidence in the total national count of adults on probation supervision. Continued use of items 17, 18, and 19 in subsequent data collection years will provide BJS with a tool to continuously monitor data collection coverage.
2. Procedures for Collecting Information
BJS emphasizes the web as the primary mode of data collection, with hardcopy forms sent to respondents upon request (see part A, item 3, "Use of Information Technology," for more information). All agencies receive a survey invitation letter requesting that they complete the survey on the web (Attachment 13). The letter explains the importance of the survey and notes that no other sources are available to provide the data requested in the surveys. It also provides a link to the most recent BJS Probation and Parole in the United States bulletin, states that participation is voluntary, and thanks agencies for their involvement. Each agency is provided with a unique user ID and password to access the survey website and complete the questionnaire.
Respondents are asked to submit their data by the due date indicated on the web and paper forms, February 28 following the end of the reference year. Westat, BJS's data collection agent, sends reminders and accepts surveys until the final data collection cutoff, a point in time determined by response rates and the ability to produce reliable national-level and state-level estimates in all states.
Approximately four weeks into data collection, preliminary analysis begins. Westat staff check the data for out-of-range values, missing data, and other types of responses that need data editing/cleaning. These preliminary analyses are undertaken while data collection is still in progress so as to provide time for making callbacks to clarify data.
Follow-up efforts are conducted throughout the data collection period either to resolve discrepancies or to learn more about unreported counts. If critical items are missing or inconsistent, such as the beginning-year or yearend population or the number of entries to or exits from supervision, Westat staff contact those respondents to determine whether they can provide estimates of the unreported quantities or an explanation for the inconsistencies (see Attachment 19, Request to Discuss Inconsistencies). Westat works with the respondents to estimate missing information, making sure to obtain clearance from the respondent before disseminating data containing any revisions (see Attachment 20, Follow-up Letter to Agency Head Regarding Data Revision).
BJS has developed several imputation methodologies for use when respondents are unable to provide information to estimate entries and exits. Different methods are used depending on the circumstances. Specific methods, and the jurisdictions to which they apply, are documented in the "Methodology" section of reports in the series Probation and Parole in the United States.3 The imputed values are used for all analyses and reports published by BJS. Survey data that have been imputed are flagged as such in the files that are sent to the National Archive of Criminal Justice Data (http://www.icpsr.umich.edu/icpsrweb/NACJD/index.jsp), maintained by the Inter-university Consortium for Political and Social Research at the University of Michigan.
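The published methodology sections govern the actual procedures; purely as a generic illustration of one common technique for administrative counts, ratio imputation scales a prior-year value by the observed change in a related quantity. The names and figures below are hypothetical:

    # Generic ratio-imputation sketch, not necessarily the specific method
    # documented for any ASPP jurisdiction; figures are hypothetical.
    def impute_entries(current_yearend, prior_yearend, prior_entries):
        """Scale the prior year's entries by the change in the yearend population."""
        ratio = current_yearend / prior_yearend
        return round(prior_entries * ratio)

    # An agency reported yearend counts for both years but no entries this year.
    estimate = impute_entries(current_yearend=5200, prior_yearend=5000, prior_entries=2400)
    print(estimate)  # 2496, flagged as imputed in the archived data files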
After all follow-up efforts, data cleaning, estimation, and analysis are completed, the report is written and the data are released to the public within six months after they are collected. In addition to the published report, the findings are announced through a Department of Justice press release posted on the BJS website.
3. Methods to Maximize Response
BJS and Westat employ several techniques to maximize response rates:
Establish contact with the agencies prior to the start of data collection and make frequent contact during the period (described below) to solicit participation.
Make it easy for agencies to participate by providing technical support and other help with the survey as needed, offering a response mode other than web if requested, and providing respondents with real-time online data checks to add efficiency to the response process.
Engage respondents in the data collection process by producing reports that respond to agency needs (see http://www.bjs.gov/index.cfm?ty=tp&tid=15#pubs).
Analyze response patterns to determine the most effective methods for contacting and following up with agencies.
The CJ-8A (Short Form) is provided as a data collection option to smaller agencies. This has been shown to improve overall data quality (see section A, item 5, “Impact on Small Businesses or Entities/Efforts to Minimize Burden” for more information).
Westat has developed a Survey Management System (SMS) that provides BJS and Westat with data as needed to monitor the progress of the data collection. The SMS maintains data about respondents and non-respondents, their response characteristics, and their communications; a sketch of the kind of record involved follows the list below. This information is available for use throughout the data collection period to inform and enhance non-response follow-up. BJS currently has real-time access to the SMS, which includes the following data:
Agency contact information (e.g., names of agency heads and designated respondents, street and email addresses, telephone numbers);
Individual files containing the image of each submitted survey from the current year, including notes provided by the respondents;
Mode and date of survey submission;
Notes describing contacts with agencies as well as follow-up efforts; and
Statistics on the current year’s overall response rates and the response rates for each survey type.
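As noted above, the following is a hypothetical sketch of the kind of per-agency record such a system might maintain; the field names are illustrative, not Westat's actual SMS schema:

    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class AgencyRecord:
        agency_name: str
        survey_form: str                       # e.g., "CJ-8", "CJ-8A", or the parole form
        contact_name: str                      # designated respondent
        contact_email: str
        contact_phone: str
        submission_mode: Optional[str] = None  # "web" or "paper"; None if pending
        submission_date: Optional[date] = None
        followup_notes: list[str] = field(default_factory=list)

    def response_rate(records):
        """Share of agencies that have submitted, monitored during collection."""
        submitted = sum(1 for r in records if r.submission_date is not None)
        return submitted / len(records)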
During the collection cycle, BJS and Westat analyze the data collected to assess response patterns (e.g., whether the same respondents are consistently late responders or the patterns vary) and missing data on submitted forms, and to develop strategies to address the timeliness and completeness of data submissions.
To draw attention to the ASPP collection in advance of the formal request to participate, a pre-notification letter is mailed and emailed to agencies in mid-November (Attachment 12). The letter provides information about the purpose and importance of the surveys, as well as the type of information to be requested, so that agencies can plan to retain the yearend information they will need. A Designation Form is included so that the agency head can select the most appropriate person to respond to the survey.
In addition to materials provided to respondents in the pre-notification letter (see part B, item 2 “Procedures for Collecting Information”), other communications inform respondents of the status of data collection or serve to remind them to respond. These include the following:
Automatic thank-you emails are sent to agencies that have submitted their web survey (Attachment 15).
Three reminder emails are sent to non-respondents throughout data collection.
The first is sent to alert respondents of the impending survey due date (Attachment 14).
The second is sent two weeks after the survey due date (Attachment 16).
The third is sent two weeks before the final cutoff of data collection (Attachment 17).
Telephone calls, as a reminder to non-respondents, were added to the 2013 data collection activities. Calls are made to non-respondents one week after the survey due date has passed. The scripts are tailored to the size, type, and reporting history of the agency. Either the Westat project manager or the project director calls the agency heads of those agencies that submitted neither the current nor the prior year's survey. The Agency Support Team calls the other non-respondents to encourage participation (Attachment 18).
Additional follow-up is conducted as needed with non-respondents that indicate a need for more time to provide data. Follow-up contact by telephone is attempted to resolve data discrepancies and obtain answers to items left unanswered in the survey (Attachment 19) (see part B, item 2 above “Procedures for Collecting Information”).
In the 2012 and 2013 data collection cycles, BJS implemented additional measures to maximize response and shorten the time required by agencies to submit their data:
As mentioned above, agencies receive up to three reminders to submit their surveys. The first reminder was moved up by one week in the 2013 data collection. The third reminder was added to the protocol for the first time in the 2012 data collection (sent in spring 2013).
Starting with the 2012 collection, Westat began sending survey invitations, which include login instructions for the web survey, in both hard-copy form through the U.S. Postal Service and in electronic form as attachments to email messages.
The Westat project director and project manager were directed to contact all large agencies that have not responded by the survey due date in order to emphasize the importance of the study and to determine the best way to encourage participation.
For the 2013 collection, the Westat project director was also directed to contact large agencies that had not provided data before June in previous years. Given BJS's goal of ending data collection and follow-up by mid-May, the purpose of these contacts is to examine ways to support these agencies in their effort to submit data earlier. This may include reviewing the information submitted by the agency during the previous data collection cycle. All agencies that were approached agreed to submit their surveys by the end of April 2014.
BJS instituted a practice of sending a close-out letter to non-respondent agencies once the data collection period ended. The letter indicates that the agency will be contacted as part of the next round of data collection and encourages the agencies to contact Westat for information about the request or for assistance in preparing for the survey. One of three versions of the close-out letter is sent, depending on whether the respondent submitted no data, partial data, or data requiring clarification that was never received (Attachments 22, 23, 24).
A comparison of the 2012 and 2013 response patterns suggests that the collective effect of these efforts has been successful in achieving earlier responses. For example, at the beginning of February in each collection cycle, there was very little difference in response rates (less than 1 percentage point). However, as of the February 28 due date, 42 percent of agencies in the 2013 collection had responded, compared with 35 percent during the 2012 collection. This difference was maintained during the following weeks (e.g., in mid-March, the rates were 61 percent for the 2013 collection compared with 53 percent for the 2012 collection). As of April 30, 2014, 84 percent of the 2013 surveys had been submitted, compared with 78 percent in 2012. BJS plans to continue working with its data collection agent and with respondents to complete data collection earlier.
Over the past several years of the ASPP surveys, these methods have enabled BJS to achieve a minimum survey response rate of 93 percent. In 2012, the response rate for the Annual Probation Survey was 93 percent (representing 99 percent of the 2012 yearend probation population) and the response rate for the Annual Parole Survey was 96 percent (representing 99 percent of the 2012 yearend parole population) (table 1).
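The two figures above are computed on different bases: the unit response rate is a share of agencies, while population coverage weights each responding agency by the number of probationers or parolees it supervises. A brief illustration (the counts below are hypothetical, not the 2012 figures):

    # Unit response rate vs. population coverage; all counts are hypothetical.
    def unit_response_rate(responding_agencies, total_agencies):
        return responding_agencies / total_agencies

    def population_coverage(responding_population, total_population):
        return responding_population / total_population

    # A handful of large agencies can dominate coverage: here roughly 93
    # percent of agencies account for 99 percent of the supervised population.
    print(round(unit_response_rate(433, 466), 2))               # 0.93
    print(round(population_coverage(3_920_400, 3_960_000), 2))  # 0.99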
Notwithstanding these efforts, unit and item nonresponse continue to be a concern. Over the past several years, rates of unit and item nonresponse have been relatively constant for the parole surveys (table 2). For the probation surveys, however, unit nonresponse has increased since the 2011 data collection (table 3).
The increase in unit nonresponse has led BJS and Westat to develop strategies to impute missing data for key items, such as the beginning-of-year count, total entries, total exits, and the yearend count. Imputation methods are documented in annual reports published by BJS.4
BJS will continue to work with Westat to address both unit and item nonresponse by working with respondents to obtain more timely data submissions and by identifying the reasons for unit nonresponse.
4. Testing of Procedures
Attachments 7, 8, and 9 provide the proposed 2014 Annual Parole Survey, Annual Probation Survey Long Form, and Annual Probation Survey Short Form. The arrangement of items on the forms reflects a logical flow of information to facilitate comprehension of requested items and to reduce the need for follow-up. Instructions and definitions are included for each item where necessary, and have been revised during previous OMB submission cycles when feedback from respondents and users indicated a need for clarification. In addition, respondents are provided a link to the bulletin from the previous year as a reference point for compiling data, and can print a copy of the data they submit.
External reviewers have found the format of the survey instruments, including item content, item display, and instructions, effective and efficient in collecting needed information while minimizing the burden.
Items 17, 18, and 19, added to the Annual Probation Survey, CJ-8 (see Attachment 8), to improve frame coverage, were pretested in May 2014 with a sample of 9 respondents selected to include both state reporters and local agencies.5 Respondents found the instructions and wording of the three questions clear and easy to understand. Most respondents had no trouble using the list of agencies presented in item 17 to indicate those with populations included in their probation population counts. The respondents did, however, often point out issues with the quality and content of the lists; the vast majority of these issues will be resolved through CAPSA before the lists are presented to Annual Probation Survey respondents, and none of the respondents made any reporting errors due to these issues. Several respondents expressed surprise that the court types listed for item 19 so closely matched their state court names, not realizing that the lists had been tailored to their specific state. None of the respondents spent more than 5 minutes answering these questions. One respondent needed to call a probation officer about item 19, but because responding to such questions was part of that officer's duties, the information was easy to obtain. Based on these findings, these three questions are expected to serve their intended purpose with minimal burden.6
Prior to the last OMB submission, the questions on the web option mimicked the presentation of the questions on the paper version. Starting with the 2011 data collection cycle, BJS redeveloped the web surveys to present one question per screen. Advantages included reduced costs related to data entry (easier to process data, as responses could be downloaded to a spreadsheet, data analysis package, or a database); dynamic error checking capability and the ability to incorporate complex skip patterns, thereby reducing the potential for response errors; the inclusion of pop-up instructions for selected questions; and the use of drop-down boxes.
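For illustration only, and not the actual ASPP web survey software, the following sketch shows the kind of per-question validation and skip logic a one-question-per-screen design makes possible; the item numbering and rules are hypothetical:

    # Hypothetical per-screen validation and skip pattern.
    def validate_count(value):
        """Reject entries that are not whole numbers before the respondent advances."""
        if not value.isdigit():
            return "Please enter a whole number."
        return None  # valid

    def next_item(current_item, answer):
        """If a reported total is zero, skip the items that break it down."""
        if current_item == 1 and int(answer) == 0:
            return 5  # jump past items 2-4, which subdivide the total
        return current_item + 1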
For the 2012 data collection year, several improvements were made to the web survey in response to feedback from respondents. Specifically, 1) instructions were added to guide respondents from question to question, 2) an option to print a .pdf version of the completed survey was added, and 3) instructions for submitting the completed survey were clarified. (In the previous year, some respondents had to be re-contacted when they entered data but neglected to hit the submit button.)
BJS has been working toward the goal of 100 percent of parole and probation agencies submitting via the web. The changes to the web survey were effective at increasing the use of the web by both probation and parole agencies. As noted in part A, item 3, "Use of Information Technology," use of the web increased dramatically to 91 percent in 2012 among parole respondents (48/53, up from 56 percent, or 30/54, in 2007) and to 84 percent in 2012 among probation respondents (366/436, up from 19 percent, or 89/463, in 2007).7
Only minimal revisions are proposed to the data collection instruments for 2014: changing the reference year from 2013 to 2014 (Attachments 7, 8, and 9) and adding questions related to frame coverage to the Annual Probation Survey, as explained in part B, item 1, "Universe and Respondent Selection" (Attachments 8 and 9).
5. Contacts for Statistical Aspects and Data Collection
The Corrections Statistics Unit at BJS is responsible for the overall design and management of the activities described in this submission, including fielding of the survey, data cleaning, and data analysis. BJS contacts include:
Thomas P. Bonczar, Statistician
Bureau of Justice Statistics
U.S. Department of Justice
810 Seventh St., NW
Washington, DC 20531
(202) 616-3615
Daniela Golinelli, Ph.D.
Chief
Corrections Statistics
Bureau of Justice Statistics
U.S. Department of Justice
810 Seventh St., NW
Washington, DC 20531
(202) 616-5164
1 On the Annual Probation Survey (Short Form), CJ-8A, these are items 8, 9, and 10 (see Attachment 9).
2 A request for an information collection review for CAPSA, 1121-xxxx, was submitted to the Office of Management and Budget in April 2014. CAPSA is planned to be conducted during the summer of 2014, with a reference date of June 30, 2014.
3 See Attachment 4, Probation and Parole in the United States, 2012; other reports in the series are available on the BJS website at http://www.bjs.gov/index.cfm?ty=pbse&sid=42.
4 Ibid.
5 These are items 8, 9, and 10 on the Annual Probation Survey (Short Form), CJ-8A (see Attachment 9); see part B, item 1, "Universe and Respondent Selection," for more information.
6 Based on the pretest with 9 respondents, the burden for the three items added to the Annual Probation Survey (form CJ-8, Attachment 8; and form CJ-8A, Attachment 9) to improve frame coverage is estimated to be 5 minutes. The estimate obtained from the pretest is less than the estimate of 15 minutes per response for the Annual Probation Survey that appeared in the 60-day notice (Federal Register, Volume 79, Number 60, pages 17775-17576, March 28, 2014; see Attachment 10). The burden estimate in the 30-day notice has been revised accordingly.
7 In 2007, there was 1 non-respondent each for probation and parole; in 2012, there were 32 non-respondents for probation.