
Census of Medical Examiner and Coroner Offices

OMB: 1121-0296


Part B. Collection of Information Employing Statistical Methods


  1. Universe and Respondent Selection


The 2023 Census of Medical Examiner and Coroner Offices (CMEC) will use procedures successfully employed in the 2004 and 2018 administrations of the CMEC to identify the universe of eligible offices. The universe is defined as any office that conducts medicolegal death investigations (MDI) for the jurisdiction it serves. To be eligible for inclusion in the CMEC, a medical examiner or coroner office must 1) investigate to determine a final cause and manner of death; 2) have the authority to sign death certificates; and 3) have the authority to request autopsies and determine when autopsies should be performed, even if the autopsy is performed outside of the office in an external autopsy facility. The 2023 CMEC will also include Texas justices of the peace who perform MDI in addition to their other court-derived responsibilities.


RTI has used the National Directory of Law Enforcement Administrators (NDLEA) database to update the census frame for the 2023 CMEC. The NDLEA contains contact information for local, state, and federal law enforcement agencies, including coroners and medical examiners. New information from this database has been combined with the frame from the 2018 CMEC to produce a current universe list of approximately 2,300 medical examiner and coroner (MEC) offices. The 2023 CMEC frame will be expanded to include an additional 700 justices of the peace in Texas who conduct death investigations. Any duplicates among the approximately 3,000 offices in the 2023 CMEC frame will be removed based on identical or nearly identical addresses, phone numbers, and/or names of the coroner or chief medical examiner. Frame verification (approved under OMB generic clearance 1121-0339), with email and telephone outreach, has been conducted to confirm or collect contact information for the CMEC frame.
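
For illustration only, the sketch below shows one way records that share a nearly identical address, phone number, or office-head name could be flagged as candidate duplicates; the field names and matching rule are hypothetical assumptions, not the actual frame-cleaning specification.

```python
# Hypothetical sketch: flag candidate duplicate frame records that share a
# normalized address, phone number, or office-head name. Field names and the
# matching rule are illustrative assumptions, not the CMEC specification.
import re
from collections import defaultdict

def normalize(value: str) -> str:
    """Lowercase and strip punctuation/whitespace so near-identical values match."""
    return re.sub(r"[^a-z0-9]", "", (value or "").lower())

def flag_duplicates(records):
    """Return groups of office IDs that share any normalized key field."""
    keys = defaultdict(list)
    for rec in records:
        for field in ("address", "phone", "head_name"):
            key = (field, normalize(rec.get(field, "")))
            if key[1]:  # ignore blank fields
                keys[key].append(rec["office_id"])
    return [ids for ids in keys.values() if len(ids) > 1]

frame = [
    {"office_id": "A1", "address": "100 Main St.", "phone": "555-0100", "head_name": "Dr. Lee"},
    {"office_id": "A2", "address": "100 Main Street", "phone": "(555) 0100", "head_name": "Lee, Dr."},
]
print(flag_duplicates(frame))  # [['A1', 'A2']] -> candidates for manual review
```

In practice, flagged groups would be reviewed manually before any record is dropped from the frame.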


As in previous administrations, the 2023 CMEC will be a census rather than a sample survey. The reasons for this decision include the following:


  • The eligible population is expected to be approximately 3,000 agencies. Moving to a sample survey with a universe of this size will not result in significant cost savings given the stratification dimensions needed to capture critical aspects of the universe, such as size and jurisdictional characteristics. The MDI system is such a well-known constellation of disparate agencies that it has been called a “patchwork” in the forensic professional, scientific, and popular literatures.


  • A census provides BJS with an opportunity to show how MEC offices vary across and within states. Being able to compare MEC offices is particularly important considering the variability that exists among these organizations in terms of administration, caseload, policies, procedures, resources, staffing, and infrastructure. These differences were critical findings in the 2004¹ and 2018² CMEC administrations.


  • Other federal, state, and local agencies are interested in CMEC data because the data can be used to describe MEC programs and infrastructure needs. With a census design, these data can be used to support the expansion and enhancement of MEC offices through funding from the Department of Justice and other sources.


  • A census provides an opportunity to build a foundation for conducting future surveys of MEC offices by other federal agencies. Completing the CMEC, for example, will provide the information necessary to produce samples based on a more comprehensive understanding of how each MEC office operates, given the variability that exists within and across states. Attempting to generate samples of MEC offices without this crucial information would be more time-intensive and costly.


  • While not essential for national estimates, the small increase in effort required to conduct a census rather than a sample would allow BJS to report on MEC offices in all 50 states and Washington, DC.


  2. Procedures for Collecting Information


Data Collection Procedures


The 2023 CMEC will use a multi-mode data collection approach, with the web as the primary mode, hard-copy data collection as an alternative (see Attachment 1 for the full survey and example screenshots of the web instrument), and mail and telephone follow-up as needed. The data collection period will last approximately ten months and will involve initial invitations, several reminders, and an end-of-study letter. In addition, there will be data quality and nonresponse follow-up (see Table 6 for the schedule).


Table 6. Outreach schedule for the CMEC

Stage | Timing | Type of contact | Attachment number(s)
Survey invitation letter (with URL and login instructions) and endorsement letter | Week 1 | All | 6, 7
Email invitation (with URL and login instructions); endorsement letter posted to web and linked | Week 2 | All | 8
First reminder letter | Week 5 | Nonrespondents | 9
Data quality follow-up phone call | Week 6 | Respondents with errors | 18
Second reminder email | Week 6 | Nonrespondents | 10
Third reminder postcard | Week 9 | Nonrespondents | 11
Telephone prompting, round 1 | Week 11 | Nonrespondents | 19
Fourth reminder letter (UPS) with questionnaire and business return envelope | Week 12 | Nonrespondents | 12
Fifth reminder email | Week 14 | Nonrespondents | 13
Telephone prompting, round 2 | Week 17 | Nonrespondents | 19
Sixth reminder email | Week 20 | Nonrespondents | 14
Seventh reminder letter with questionnaire and business return envelope | Week 22 | Nonrespondents | 15
Eighth reminder letter | Week 26 | Nonrespondents | 16
Ninth reminder email | Week 27 | Nonrespondents | 17
Critical items letter and survey sent with business return envelope; login credentials provided | Week 31 | Nonrespondents | 21
Critical items email | Week 32 | Nonrespondents | 22
Critical items prompting calls | Week 34 | Nonrespondents | 19
End-of-study letter | Week 39 | Nonrespondents | 23
End-of-study email | Week 40 | Nonrespondents | 24
Close data collection | Week 42 | n/a | n/a
Completion thank-you email | Rolling | Respondents | 25


The data collection will begin with an invitation letter mailed via the United States Postal Service (USPS) to the head of each office to inform them about the survey. As new respondents to the CMEC, the justices of the peace in Texas will receive a slightly different letter than the MEC offices (see Attachment 6). The survey invitation letter will be signed by the BJS Acting Director and will stress the purpose and importance of the CMEC and the need for each participant's cooperation. Further, it will notify the recipient of the survey due date, provide instructions for submitting the survey online, and provide the project Help Desk email address and telephone number should there be any questions. Included with the invitation will be a letter of support from the International Association of Coroners and Medical Examiners (IACME) (Attachment 7). About 3 business days after the mailed invitation letter is sent, an email invitation and CMEC flyer (see Attachment 8) will be sent to those directors/designees for whom an email address is available. This invitation will closely mirror the mailed invitation letter and will contain a hyperlink to the web survey.


Approximately three weeks after the invitation is sent, RTI will mail the first reminder letter to nonrespondents (Attachment 9). Beyond the initial reminder letter, over the course of approximately ten months, eight additional reminder mailings and emails (see Attachments 10-17) will be sent to those who have not yet responded, alternating between media to keep the reminders fresh for respondents. Both the mailed and emailed reminders will contain information for completing the web survey. However, because some respondents may prefer to respond using a hard-copy questionnaire, three mailed reminders will include a paper questionnaire and a business reply envelope for easy return.


Approximately five weeks after data collection begins, RTI will begin reviewing the data received. As data discrepancies or missing data values are discovered, RTI staff will follow up with respondents via telephone or email to clarify responses or obtain missing information (Attachment 18).


Approximately three weeks before the survey due date, telephone and email follow-up with nonrespondents will begin as described earlier (see Attachment 19). Respondents will be reminded of the purpose and importance of the survey and informed of the goal of receiving a completed survey from each office. They will be asked to submit the survey online but will be sent another hard-copy version of the survey if requested. RTI will make up to 8 calls until a survey is received (or an office refuses to participate), and each call will reference the most recent communication (e.g., reminder letters, reminder emails).


By month 6, BJS will determine whether a shorter critical item survey is needed to bolster response rates. During the 2004 and 2018 CMEC, about a dozen questions were deemed "critical" for BJS to obtain, including staffing, budget, and workload data; the abbreviated survey accounted for 18.6% of returned 2018 CMEC surveys. BJS has already identified the critical items that would be needed for the 2023 administration (Attachment 20). Should this effort be necessary, critical item capture will begin following the ninth reminder email. The abbreviated paper-and-pencil instrument (PAPI) survey would be mailed to the office head with an invitation to complete the shortened survey on the web or to return the paper form via the USPS (Attachment 21); the letter would provide the login credentials and the Help Desk email address and toll-free telephone number, and the mailing would include a business return envelope to facilitate paper response. About 3 business days after the critical items letter is mailed, a critical items email would be sent to the CMEC contact (Attachment 22). Nonresponse follow-up incorporating critical item data capture would take place 3 weeks after that mailing. Using this approach, when the 2018 CMEC ended data collection, there was an overall response rate of 80.7%, with 51.0% of offices responding via the web.


RTI will send the end-of-study notification via both mail and email to inform nonrespondents that the study is coming to an end and that their response is needed within two weeks (Attachments 23-24). Data collection will continue for approximately three more weeks to allow for receipt of the remaining questionnaires.


Immediately after surveys are submitted, respondents will receive a thank-you email (Attachment 25). These emails will thank them for the time and effort required to complete the survey, formally acknowledge receipt of the survey, and state that the agency may be contacted for clarification once their survey responses are processed.


Data Processing


Upon receipt of a survey (web or hardcopy), data will be reviewed and edited, and, if needed, the respondent will be contacted to clarify answers or provide missing information. Respondents who submit via the web will be prompted by real-time validation checks when they enter missing or inconsistent data. Any items that remain unresolved after the respondent submits will result in RTI staff recontacting the respondent to attempt to resolve them.
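
As an illustration of the kind of real-time check described above, the sketch below compares component staffing counts to a reported total; the item names and the rule itself are hypothetical assumptions, not the actual CMEC edit specification.

```python
# Hypothetical validation rule: full-time plus part-time staff should equal
# total staff. Item names and the rule are illustrative assumptions.
def check_staffing(responses: dict) -> list:
    """Return soft-error messages for the respondent to review before submitting."""
    errors = []
    ft = responses.get("staff_fulltime")
    pt = responses.get("staff_parttime")
    total = responses.get("staff_total")
    if total is None:
        errors.append("Total staff is missing.")
    elif None not in (ft, pt) and ft + pt != total:
        errors.append(f"Full-time ({ft}) plus part-time ({pt}) does not equal total staff ({total}).")
    return errors

print(check_staffing({"staff_fulltime": 4, "staff_parttime": 2, "staff_total": 7}))
# ['Full-time (4) plus part-time (2) does not equal total staff (7)']
```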


The hardcopy survey will be developed and keyed using TeleForm, which will allow the surveys to be scanned and the data read directly into the same database containing the web survey data. This will ensure that the same post-collection data quality review procedures, which mirror and expand upon the web validation checks, are applied to all survey data, regardless of response mode. The following is a summary of the data quality assurance steps that RTI will take during the data collection and processing period:

Data Editing. RTI will reconcile missing or erroneous data through automated and manual edits of each questionnaire. In collaboration with BJS, RTI will develop a list of edits that can be completed by referring to other data provided by the respondent on the survey instrument. For example, if a screening question was left blank, but the follow-up questions were completed, a manual edit could be made to indicate the intended positive response to the screening question. Through this process, RTI can quickly identify which hardcopy cases require follow-up and indicate the items that need clarification or retrieval from the respondent.
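
A minimal sketch of one such automated edit, following the screening-question example above, is shown below; the item names and the edit rule are hypothetical and would be specified jointly with BJS.

```python
# Hypothetical automated edit: if a screening (gate) item is blank but its
# follow-up items were answered, infer the intended "Yes" and log the edit.
# Item names (q5_screener, q5a, q5b) are illustrative assumptions.
def apply_screener_edit(case: dict, edit_log: list) -> dict:
    followups = ("q5a", "q5b")
    blank = (None, "")
    if case.get("q5_screener") in blank and any(case.get(q) not in blank for q in followups):
        case["q5_screener"] = "Yes"
        edit_log.append((case["office_id"], "q5_screener", "blank -> Yes (follow-ups answered)"))
    return case

log = []
case = {"office_id": "A1", "q5_screener": "", "q5a": 4, "q5b": "Full-time"}
apply_screener_edit(case, log)
print(case["q5_screener"], log)  # Yes, with the edit recorded for review
```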

Data Processing. When the project team identifies a potential data issue, such as missing or inconsistent answers, an RTI professional staff member will contact the respondent for clarification. Throughout the data retrieval process, RTI will document the critical questions needing retrieval (e.g., missing or inconsistent data elements), request clarification on the provided information, obtain values for missing data elements, and examine any other issues related to the respondent’s submission.

Data Entry. Respondents completing the survey via the web instrument will enter their responses directly into the online instrument. For respondents returning the survey via hardcopy (mail or fax), data will be scanned once the survey is received and determined to be complete. To confirm that editing rules are being followed, RTI will review frequencies for the entered data after the first 10% of cases are received. Any issues will be investigated and resolved. Throughout the remainder of the data collection period, RTI staff will conduct regular data frequency reviews to evaluate the quality and completeness of data captured in both the web and hardcopy modes.
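
A simple sketch of such a frequency review is shown below; the item name and codes are hypothetical placeholders, and the production review would cover every instrument item.

```python
# Hypothetical sketch: tabulate item frequencies for early cases so unexpected
# blanks or out-of-range codes stand out during review.
from collections import Counter

def frequency_review(cases, items):
    """Return a frequency table for each listed item, counting missing values explicitly."""
    return {item: Counter(c.get(item, "MISSING") for c in cases) for item in items}

cases = [
    {"office_id": "A1", "q2_office_type": "Coroner"},
    {"office_id": "A2", "q2_office_type": "Medical examiner"},
    {"office_id": "A3"},  # item not answered; tallied as MISSING
]
print(frequency_review(cases, ["q2_office_type"]))
```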


  3. Methods to Maximize Response Rates


Minimizing Nonresponse


The 2018 CMEC achieved an 81% response rate for critical items. BJS and RTI will undertake various activities to ensure that high response rates are again achieved for the 2023 CMEC. The CMEC will use a web-based instrument supported by various online help functions to maximize response rates. A toll-free number will also be provided for both substantive and technical assistance, and RTI staff will respond to these requests for assistance.


The previous CMEC administrations have enjoyed widespread support from the International Association of Coroners and Medical Examiners, which was enlisted to help develop the questionnaire and to encourage individual offices to respond to the survey. This continues to be the case for the 2023 CMEC (Attachment 7). The National Association of Medical Examiners, the American Board of Medicolegal Death Investigators, the National Sheriffs' Association, and the Society of Medicolegal Death Investigators have also endorsed the 2023 CMEC.


The survey instrument was reviewed to ensure that it captured the most relevant information, and unnecessary questions were removed to reduce burden. An item-level assessment of the 2018 CMEC was conducted to look for patterns of nonresponse, and items with lower response rates were dropped. In addition, the questionnaire was reviewed by expert panel reviewers and by BJS and RTI for ease of use, flow, and adherence to survey methodology best practices to ensure ease of administration.


To promote 100% item completion by respondents, RTI will monitor item response rates as surveys are submitted. RTI will maintain a survey management system linked to the web-based application that will flag missing items and invalid responses. RTI will also flag missing items on hard-copy submissions as they are received. The data collection manager will oversee phone and email outreach to respondents to clarify missing or invalid responses and take corrective action. Changes to survey responses obtained through this follow-up effort will be tracked and entered in the data collection database.


Adjusting for Nonresponse. With any survey, it is typically the case that some units, in this case MEC offices, will not respond to the survey request (i.e., unit nonresponse) and some will not respond to particular questions (i.e., item nonresponse). Data will be checked as they are collected for completeness and logical consistency of responses. Imputation procedures, such as random hot-deck imputation, will be used to address item nonresponse.
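
A minimal sketch of random hot-deck imputation is shown below, assuming office type as the imputation class and a full-time-staff item as the missing value; the class variable, item, and donor rule are illustrative assumptions rather than the final imputation specification.

```python
# Hypothetical sketch of random hot-deck imputation: a missing value is
# replaced by the reported value of a randomly chosen donor from the same
# imputation class (here, office type). Names are illustrative assumptions.
import random
from collections import defaultdict

def hot_deck_impute(records, item, class_var, seed=2023):
    rng = random.Random(seed)          # fixed seed so the imputation is reproducible
    donors = defaultdict(list)
    for rec in records:                # collect reported values by class
        if rec.get(item) is not None:
            donors[rec[class_var]].append(rec[item])
    for rec in records:                # fill recipients from a random donor in the same class
        if rec.get(item) is None and donors[rec[class_var]]:
            rec[item] = rng.choice(donors[rec[class_var]])
            rec[item + "_imputed"] = True
    return records

offices = [
    {"id": 1, "type": "coroner", "ftes": 3},
    {"id": 2, "type": "coroner", "ftes": None},  # recipient: receives a coroner donor's value
    {"id": 3, "type": "coroner", "ftes": 5},
    {"id": 4, "type": "ME", "ftes": 40},
]
print(hot_deck_impute(offices, "ftes", "type"))
```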


Because the 2023 CMEC is planned to be a complete census of coroner and medical examiner offices, sampling weights should not be necessary. However, in the event unit response rates are lower than anticipated, some weighting of the data may be required. The extent of weighting will depend on response rates within sub-groups of the respondent pool. Initially, response rates within jurisdiction size groupings, region of the country, and coroner/medical examiner type will be reviewed to determine whether a weighting adjustment is necessary.


Based on previous administrations, an overall response rate of at least 81% is expected. To ensure that nonresponding MEC offices are not fundamentally different from those that participate, a nonresponse bias analysis will be conducted if the agency-level response rate obtained in the 2023 CMEC falls below 85%. Administrative data on agency type, size, census region or division, and population served will be used in the nonresponse bias analysis as weighting classes. For each agency characteristic, RTI will compare the distribution of respondents to that of nonrespondents. Weights will be calculated such that they sum to the frame population totals within each weighting class. Because this is a census, design weights are effectively 1 and every office in the frame is eligible to participate. Thus, the nonresponse adjustment weight simplifies to the inverse of the response rate within the weighting class, that is, the number of eligible offices in the class divided by the number of responding offices in the class, applied to every responding office in that class.
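
The sketch below illustrates this weighting-class adjustment under hypothetical classes (agency type crossed with region): each responding office receives the count of eligible frame offices in its class divided by the count of respondents in that class, so the respondent weights sum to the frame total within every class.

```python
# Hypothetical sketch of a class-based nonresponse adjustment for a census:
# design weights are 1, and each respondent's adjusted weight is
# (frame offices in class) / (responding offices in class).
from collections import Counter

def nonresponse_weights(frame, respondent_ids, class_key):
    """Return a weight for each responding office so weights sum to frame totals by class."""
    frame_counts = Counter(class_key(o) for o in frame)
    resp_counts = Counter(class_key(o) for o in frame if o["office_id"] in respondent_ids)
    return {
        o["office_id"]: frame_counts[class_key(o)] / resp_counts[class_key(o)]
        for o in frame
        if o["office_id"] in respondent_ids
    }

frame = [
    {"office_id": "A1", "type": "coroner", "region": "South"},
    {"office_id": "A2", "type": "coroner", "region": "South"},
    {"office_id": "A3", "type": "coroner", "region": "South"},  # nonrespondent
    {"office_id": "B1", "type": "ME", "region": "West"},
]
class_key = lambda o: (o["type"], o["region"])
print(nonresponse_weights(frame, {"A1", "A2", "B1"}, class_key))
# {'A1': 1.5, 'A2': 1.5, 'B1': 1.0} -> coroner/South weights sum to 3, the frame total
```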

  4. Final Testing of Procedures


The proposed new questions in the 2023 CMEC and the revisions made to questions retained from 2018 were reviewed by BJS and RTI staff, suggested and discussed by the expert panelists, and cognitively tested. The cognitive testing provided insight into whether respondents fully understood the questions and provided expected answers, informed the phrasing and response options, and provided an estimate of burden (Attachment 3). The instrument was modified as a result of these interviews to improve comprehension. In addition, RTI will thoroughly test the web-based survey administration system through systematic user testing, including testing skip patterns, attempting to "break" the instrument, and running back-end data checks on entered responses.


The 2023 CMEC will maintain respondent recruitment and support procedures similar to those of the previous CMEC administration, which were field tested and successfully employed. RTI has used web-based survey instruments substantially similar in format and design to the 2023 CMEC instrument for other BJS surveys, including the Law Enforcement Management and Administrative Statistics (LEMAS) survey and the Census of State and Local Law Enforcement Agencies (CSLLEA). The web-based survey administration procedures successfully employed in the LEMAS and CSLLEA designs will be substantially retained but modified as necessary to accommodate the 2023 CMEC instrument and respondents.


  5. Contacts for Statistical Aspects and Data Collection


  1. BJS contacts include:

Matt Durose

CMEC Program Manager

202-598-0295

Matt.Durose@usdoj.gov


Alexia Cooper

Law Enforcement Statistics Unit Chief

202-307-0582

Alexia.Cooper@usdoj.gov


  2. Persons consulted on statistical methodology:


Heather Meier

RTI International


  3. Persons consulted on data collection and analysis:


Hope Smiley-McDonald

RTI International


Kelly Keyes

RTI International


Amanda Smith

RTI International


Ryan Weber

RTI International



1 U.S. Department of Justice. Bureau of Justice Statistics. Medical Examiners and Coroners’ Offices, 2004. NCJ 216756. https://bjs.ojp.gov/content/pub/pdf/meco04.pdf.

2 U.S. Department of Justice. Bureau of Justice Statistics. Medical Examiner and Coroner Offices, 2018. NCJ 302051. https://bjs.ojp.gov/content/pub/pdf/meco18.pdf.
