BJS Response to OMB Second Passback (8/31/2012)

2012 Census of Problem Solving Courts (CPSC)


OMB Control No.: 1121-0337


From: Martinez, Shelly [mailto:Rochelle_W._Martinez@omb.eop.gov]
Sent: Thursday, August 30, 2012 05:56 PM
To: Adams, Devon
Subject: RE: CPSC Updated SS materials
 

Devon – we have a few little things outstanding, below.  We inserted our comments after the BJS initial response to our questions.  Please ask the team to address these items.  Shelly



  1. Please reconcile the burden estimate of 30 minutes in SS A12 and that of 1 hour on the front of the questionnaire. In addition, please make consistent the ROCIS numbers (3,854 and 1,918) and the SS numbers (3,800 and 1,900).


Additional text and a table were added to Part A Section 12 clarifying the apparent discrepancy (see track changes).
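For illustration only (an assumption about what the figures represent, not language drawn from the SS; the authoritative reconciliation is the table now in Part A Section 12): if the SS figure of roughly 3,800 represents the expected number of responses and 1,900 the corresponding total burden hours, the 30-minute average ties the two together directly:

3,800 responses × 0.5 hours per response = 1,900 burden hours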


OMB: We are concerned that the burden on the original questionnaire ranged from 15 minutes to 150 minutes. From the passback, we understand that the revised questionnaire has not been tested to estimate time to complete. It is estimated that the questionnaire will consume 30 minutes on average, but as noted, the average belies a wide variation in actual administration times. Please provide some evidence that there will not be a slew of participants who will take 150 minutes to complete the revised survey, hidden behind the burden as estimated for the average timing.


BJS: Please see the newly added text in Part A Section 12 addressing the 150-minute issue and providing additional detail on NCSC’s in-house testing of the revised questionnaire. Additional text addressing the issue was also added to Part A Section 5. See track changes.


  2. Please provide a screen shot of a page or two of the questionnaire (actual questions, not just the welcome page).


An updated screen shot is included with this round of revisions.


OMB’s recommended changes were made and highlighted using MS Word’s comments feature. While making the requested changes, we found other questions that could be clarified (see questions 8, 15, 23, 30, and 31). We made these changes (see track changes) for your review. We are willing to go back to the original versions of these questions if they do not meet OMB’s standards.


OMB:  Thank you for informing us of the updates and clarifications to other questions. 

  3. For question 15, should proportion be changed to percentage for the sake of consistency?


BJS: Proportion has been changed to percentage. See track changes.


  4. In questions 30 and 31, what does ‘data element not applicable’ mean? In what situation do you think a respondent will endorse that option?


BJS: Q30 – “data element not applicable” is necessary if respondents answer “no” to Q12 or when the answer to Q13 is a date in the future. Q31 – The administration of each PSC is unique, and some types of offender participant exit (e.g., general discharge) may not be an “exit option” in a given court.


  5. Questionnaire

    1. The first item under question 1 would contrast better with the second item under question 1 if the word “single” were inserted in the phrase “Identify the SINGLE category label…”


The recommended changes were made to the questionnaire.


    2. Question 1, Item d (“Your professional information”) – this is vague. What is this asking and why is it necessary? There are a lot of items about the person, when in fact you presumably already have contact information, given that you contacted them in the first place from the SPOC contact information.


The question was expanded to ask for specific information.


    3. Question 4 – what if the person doesn’t know about the “creation” period, which apparently could have been 20 years ago?


Respondents now have the option to indicate the information is not currently available and then skip the question.


OMB: Actually the new option reads: “Stakeholder information is not available”. Someone may know the stakeholders, who may be continuing to play a role in the program, but not have been present to know what occurred during the ‘creation’ period. Are you instead really trying to capture “Information about planning efforts is not available” or simply a “Don’t know” to avoid confusion?


BJS: Text changed to “Information about planning efforts is not available.” See track changes.


    4. Item 7 – does “poor jurisdictional compliance” mean that the city or county is not complying? Clarify.


The typo has been corrected: “jurisdictional” was changed to “offender.”


    5. Question 9 – this is way too long and poorly structured. Suggest: “For each of the following key stakeholders, does the problem-solving court mandate any type of training (e.g., formal training curriculum, informal brown bag sessions on key topics) specific to the needs of program participants (e.g., about underlying causes of their justice system involvement, or relevant health or behavioral problems like drug addiction, mental illness, sex offending, domestic violence)?”


The question was revised to improve clarity. Please review this revision to see if it addresses OMB’s concerns.


OMB: In the revision, the response options no longer match the actual question stem. We think you still need “For each of the following stakeholders” to make the question work. In addition, the “their” reference is confusing; presumably this refers to the program participants’ justice system involvement, not to the stakeholders’ justice system involvement. The parentheses seem to make this referent unclear.


BJS: Text has been modified per OMB suggestion and additional language has been added to clarify “their” as referring to offenders. See track changes.


    6. Item 28 – is part b asking about aggregate or individual data? As written, this seems unclear, and if interpreted other than as intended, the wrong answer could easily be provided.


The question was revised to improve clarity. Please review this revision to see if it addresses OMB’s concerns.



  6. In the first phase of data collection (directed at the SPOCs for list building), BJS needs to provide the IC (the referenced spreadsheet template) specifying the variables listed in Part B. It sounds like the IC (or perhaps a second one) should also capture, in a standard way, the judgments or methodology used to identify PSCs, per the discussion in SS Part B.


The Excel template (attachment 13), which specifies the variables to be collected from SPOCs, is now provided. Language on the template and in attachment 10 gives SPOCs specific instructions regarding the inclusion of PSCs and includes a question to systematically capture each state’s definition of PSCs. (See Part B Section 1, page 4.)


OMB: The word ‘exclude’ seems easy to overlook in the first line of instructions on the Excel template. Perhaps underline it for clarity.


BJS: “Exclude” has been underlined.



  7. Please confirm that, at the end of the list-building phase, BJS plans to return to OMB for approval if the number of PSCs exceeds the estimate, thereby raising the burden estimates.


Language was added in Part A Section 12 and Part B Section 1 of the SS indicating BJS will return for OMB approval if the number of PSCs exceeds the estimate.


OMB: Thank you for the clarification.


OMB: In the discussion of response rates in Statement B, a word or words appear to be omitted after “90,” which presumably means you expect a 90% response rate.

BJS: We are willing to make the correction; however, we cannot find “90” without the “%” or “90%” without a reference to the response rate. If you could please highlight the issue in MS Word, we will make the change.


