Generic Clearance for Cognitive, Pilot, and Field Studies for The Office of Juvenile Justice and Delinquency Prevention Data Collection Activities

OMB: 1121-0360


U.S. Department of Justice

Office of Justice Programs

National Institute of Justice
Washington, DC 20531


MEMORANDUM TO: Joe Nye, Senior Policy Analyst

Office of Information and Regulatory Affairs

Office of Management and Budget


THROUGH: Benjamin Adams, Office Director

Office of Crime Prevention and Youth Justice (OCPYJ)

National Institute of Justice (NIJ)


FROM: Kaitlyn Sill, Social Science Analyst, OCPYJ, NIJ


SUBJECT: Generic Information Collection Request: Cognitive Interviewing for the Juvenile Residential Facility Census (JRFC) and Census of Juveniles in Residential Placement (CJRP) through the generic clearance agreement (OMB Number 1121-0360)


DATE: October 19, 2023





Request: The Census Bureau plans to conduct additional research for the Department of Justice's National Institute of Justice (NIJ) and Office of Juvenile Justice and Delinquency Prevention (OJJDP) under their generic clearance for questionnaire pretesting research (OMB Number 1121-0360). We will conduct cognitive interviews to test proposed questions for the Juvenile Residential Facility Census (JRFC) and the Census of Juveniles in Residential Placement (CJRP), along with an online survey with cognitive probes.


The Juvenile Residential Facility Census (JRFC) and Census of Juveniles in Residential Placement (CJRP), conducted in alternating years, collect information on attributes and characteristics of juvenile residential facilities (JRFC) and characteristics of the juveniles housed in those facilities (CJRP).


Purpose: The purpose of cognitively testing proposed new questions for the JRFC/CJRP is to assess whether the questions measure the underlying constructs of interest and to better understand the accessibility of the requested data and the burden of compiling responses. The content to be tested builds on OJJDP’s redesign study of the JRFC/CJRP, completed in 2022, which identified promising new content but recommended additional question development and testing to determine burden and feasibility. Feedback from these interviews will be used to refine question wording and to decide whether to include the new questions in the Juvenile Residential Facility Census and the Census of Juveniles in Residential Placement. The questions will focus on length of stay, facility classification, demographics, the mode of the survey (online versus paper), and beliefs about the confidentiality and privacy of the data.


Population of Interest: Respondents from residential facilities housing youth.


Timeline: Testing will be conducted from October 2023 through February 2024.


Language: Testing will be conducted in English only.


Method: The purpose of cognitively testing the proposed JRFC/CJRP questions is to minimize measurement error and maximize the validity of these questions by assessing whether they accurately measure the underlying construct of interest. Cognitive interviewing is a method of pretesting instruments that involves in-depth interviewing, paying particular attention to the mental processes respondents use to answer questions.1 Cognitive interviewing evaluates questions against their intended outcomes, including whether they accurately elicit the underlying construct of interest and how accurately respondents can provide the requested data.


Staff from the Data Collection Methodology and Research Branch at the U.S. Census Bureau plan to conduct up to 55 moderated interviews via Microsoft Teams or telephone. The interviewers will follow a semi-structured interview protocol (Attachment A). Before calling, interviewers will send respondents a link to an online survey instrument hosted in Qualtrics, an online survey platform. Respondents will be asked to think aloud as they work through the online survey, with concurrent probes asked throughout the interview. Interviews will be recorded if the respondent consents.


In addition, in the second round of testing, we will collect up to 200 online survey responses with cognitive probes. We will send a self-administered version of the questions via Qualtrics, which will include a smaller number of open- and closed-ended cognitive probes, similar to those used during the telephone interviews, to assess respondents’ interpretations of the survey items and their experiences reporting to them.


Sample: We plan to conduct a maximum of 55 moderated cognitive interviews in total over two rounds of data collection, plus 200 online surveys in the second round. In the first round, to test the new content, we will target 20 to 30 respondents across all types of facilities to complete a cognitive interview about current and new content. We will purposively sample on facility characteristics (i.e., private versus public facility, type of services provided) and on how facilities complete the survey (i.e., paper form versus web form). The research staff will then review the resulting data and adjust the new content questions and the interviewing protocol if necessary. We will then conduct a second round of moderated cognitive interviews targeting 15 to 25 respondents, again across all types of facilities. In addition, we will send an online version of the revised questions to up to 200 respondents in round two.


This number of interviews was targeted because it is manageable within the allotted time period, should adequately cover the targeted facility types, and should be large enough to elicit reactions to the questions that yield meaningful findings.


The sampling universe for the JRFC/CJRP cognitive testing sample consists of facilities in the 2022 JRFC and 2023 CJRP universe for which the Census Bureau has email addresses.


Recruitment: Participants will be recruited using a sample file drawn from the 2022 JRFC and 2023 CJRP universe as described above. First, we will send an email to the facility contact with instructions for scheduling an interview date and time. We will verify the appointment time and reply by email with a confirmation of the scheduled appointment; in that email, we will also verify the best number at which to reach the respondent. About 30 minutes before the appointment time, we will email the respondent a reminder of the upcoming appointment along with a link to the survey. The first screen of the online survey will present a Paperwork Reduction Act (PRA) and Privacy Act (PA) statement informing participants that their response is voluntary and that the information they provide is confidential under 34 U.S.C. § 10231 and 28 C.F.R. Part 22, and asking for consent to be interviewed. Respondents will also have the option to opt out of being recorded at this time. Respondents will need to click a checkbox indicating that they understand these rights and agree to be interviewed. If email recruitment does not meet our recruitment goals, we will move to telephone calls.

Protocols: A copy of the draft round 1 interview protocol for the moderated interviews (Attachment A) is included.


Use of incentive: Monetary incentives for participation will not be offered.


Below is a list of materials to be used in the current study:


Attachment A: Draft protocol outlining intended questions to guide the moderated interviews for current and new JRFC/CJRP content


Attachment B: Draft instrument for current and new JRFC/CJRP content


Attachment C: Consent form example, including PRA/PA statements



Length of interview: For moderated cognitive interviews, we expect each interview to last no more than 45 minutes (55 cases x 45 minutes per case = 41.25 hours). Additionally, to recruit respondents we expect to reach out via email and to make up to 3 contacts per completed case. The recruiting emails and calls are expected to last 3 minutes each, on average (3 contacts per completed case x 55 cases x 3 minutes per contact = 8.25 hours). Thus, the estimated burden for the moderated cognitive interviews is 49.5 hours.


The online survey with cognitive probes will be administered to up to 200 respondents in the second round and will take approximately 20 minutes to complete (200 cases x 20 minutes per case = 66.67 hours). This results in a total burden of 116.17 hours.
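For reference, the burden-hour arithmetic above can be reproduced with the short Python sketch below. All inputs (55 interview cases at 45 minutes each, 3 recruiting contacts of 3 minutes each per completed case, and 200 online surveys at 20 minutes each) are taken directly from this memo; the script is illustrative only, and the two-decimal rounding is added solely for display.

    # Minimal sketch reproducing the respondent burden estimates stated above.
    MINUTES_PER_HOUR = 60

    # Moderated cognitive interviews (both rounds)
    interview_hours = 55 * 45 / MINUTES_PER_HOUR           # 41.25 hours
    recruiting_hours = 55 * 3 * 3 / MINUTES_PER_HOUR       # 3 contacts x 3 minutes per case = 8.25 hours
    moderated_burden = interview_hours + recruiting_hours  # 49.5 hours

    # Online survey with cognitive probes (round 2 only)
    online_burden = 200 * 20 / MINUTES_PER_HOUR            # about 66.67 hours

    total_burden = moderated_burden + online_burden        # about 116.17 hours
    print(f"Moderated interviews: {moderated_burden:.2f} hours")
    print(f"Online survey: {online_burden:.2f} hours")
    print(f"Total burden: {total_burden:.2f} hours")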


The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Hillary Steinberg

Data Collection Methodology and Research Branch

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-5383

Hillary.steinberg@census.gov


cc:
Elizabeth Willhide (Census, ERD) with enclosures

Sabrina Webb (Census, ERD) “ ”

Megan Minnich (Census, ERD) “ ”

Jonathan Albers (Census, ERD) “ ”

Benjamin Adams (DOJ) “ ”

Kaitlyn Sill (DOJ) “ ”


1 Campanelli, P. 2007. “Methods for Testing Survey Instruments.” Short Course, Joint Program in Survey Methodology (JPSM). Arlington, VA.
