Memo to OMB on Cognitive and Pilot Test for NCVS Redesign


Generic Clearance for Cognitive, Pilot and Field Studies for Bureau of Justice Statistics Data Collection Activities


OMB: 1121-0339





U.S. Department of Justice


Office of Justice Programs


Bureau of Justice Statistics

Washington, D.C. 20531


MEMORANDUM



To: Robert Sivinski

Office of Statistical and Science Policy

Office of Management and Budget


Through: Melody Braswell

Clearance Officer

Justice Management Division


Jeffrey H. Anderson

Director

Bureau of Justice Statistics


From: Jennifer Truman

Bureau of Justice Statistics


Date: May 28, 2019


Re: BJS Request for OMB Generic Clearance for Cognitive Testing and a Pilot Test under the Generic Clearance for Cognitive, Pilot and Field Studies for Bureau of Justice Statistics Data Collection Activities, OMB Number 1121-0339



The Bureau of Justice Statistics (BJS) requests clearance for a pilot test of field procedures under generic clearance agreement, OMB Number 1121-0339, for activities related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program. BJS, in consultation with Westat under a cooperative agreement (Award 2013-MU-CX-K054 National Crime Victimization Survey (NCVS) Instrument Redesign and Testing Project), is working to redesign the NCVS survey instrument and test alternative modes of administration. The NCVS was last redesigned in 1992. Much has changed in the interim, both in the level of public acceptance of surveys and in terms of the nature of crime. The primary purpose of the NCVS Instrument Redesign and Testing Project is to provide scientific and technical support for the redesign and testing of the NCVS roster control card, crime screener (NCVS-1), and crime incident (NCVS-2) instruments in support of BJS’s efforts related to increasing the efficiency, reliability, and utility of the NCVS.


Prior generic clearances for this project (OMB Number 1121-0325) allowed for cognitively testing different sections of the NCVS instrument with adults and youth, which led to a revised NCVS instrument. Another generic clearance provided for usability testing regarding the navigation and flow of the web-based instrument. Following that testing, BJS decided to pursue an interviewer-administered version of the instrument, and has requested an experimental comparison of the interviewer-administered instrument currently fielded by the Census Bureau, the redesigned interviewer-administered version, and the redesigned self-administered version developed as part of the prior generic clearances.


This request covers clearance for one additional round of cognitive testing, and a small-scale pilot test of data collection field procedures and testing of the interviewer-administered version of the redesigned instrument alongside the current NCVS instrument. Under separate cover, BJS will request full clearance for a field test with an experimental design comparing three versions of the instruments – interviewer-administered versions of the current and redesigned NCVS and a self-administered version of the redesigned questionnaire. That field test will involve administering the survey to a representative sample of persons age 12 or older, testing aspects of the design such as mode, victimization-screener approaches, response rates, and administration times. BJS plans to request clearance for the full field test in the spring of 2019. The pilot test and field test will use the title “National Survey of Crime and Safety” (NSCS) to distinguish the tests from the ongoing National Crime Victimization Survey.


Description and Purpose of Overall Project


This request is for generic clearance to conduct a final round of cognitive testing and a small-scale pilot test of procedures for the interviewer-administered versions of the NCVS (current and redesigned) to be used in the larger field test. This pilot will inform the field protocol and training needs for the larger field test. The final round of cognitive testing will test additional changes that were made to the instrument following usability testing in late 2018. Before describing the proposed methods for the pilot, this section will describe the goals and research questions for the larger NCVS Instrument Redesign and Testing Project.


The NCVS is based on research conducted by the Department of Justice and the U.S. Census Bureau in the 1970s (summarized in Lehnen and Skogan, 1981; Skogan, 1990; Skogan and Lehnen, 1985). There was a major redesign implemented in 1992, motivated in part by a National Academy of Sciences (NAS) review (Penick and Owens, 1976). A more recent review by the NAS (Groves and Cork, 2008) provided similar motivation for the current BJS redesign effort.


Since 2008, BJS has initiated a number of research projects to assess and improve upon core NCVS methodology, including redesigning the sample plan, comparing alternative modes of interviewing, testing methods for reducing non-response bias, experimenting with various reference period lengths, testing the effectiveness of new victimization screening questions, and exploring various sub-national estimation methods.


The current NCVS Instrument Redesign and Testing Project is part of BJS’s work (with Westat under a cooperative agreement) to develop a new design for the NCVS. The overarching objective for this project is to redesign and test the NCVS roster control card, crime screener (NCVS-1), and crime incident report (NCVS-2). Ultimately, BJS aims to evaluate and modernize the organization and content of the NCVS; improve the efficiency of the instruments and the current core-supplement design; and develop a procedure for introducing routine improvements to the survey in order to capture emerging crime types and time-relevant topics.


A first step in the NCVS Instrument Redesign and Testing Project was a comprehensive assessment of the instruments to determine which survey items are being utilized and how by BJS and NCVS data users; which survey items are problematic in their language and placement; and where there are gaps in the content of the instrument. The initial assessment, which was completed within the first year of the project, provided an understanding of the substantive and methodological issues with the instrument and helped to identify areas where improvements to the content would enhance current knowledge of victimization and its correlates.


One major focus of the project has been on revising and updating the content of the survey. This activity includes increasing the relevance and utility of the crime incident report (CIR); building in a series of questions on perceptions of police and community safety to be asked of all respondents (‘ask all’ items); adding questions on correlates of victimization; revising and expanding questions on victim services; revising the questions on immediate reactions by the victim; and adding vandalism as a new type of crime.


A second major focus of the NCVS Instrument Redesign and Testing Project has been on streamlining the screening questions, which ask respondents to report whether they experienced various types of crime victimizations during the last six months.1 The screener, first implemented in 1992, incorporates a wide range of verbal cues and examples to prompt recall of victimizations. In addition, it organizes the questions in a “blocked” format; in this format, all of the screening items are administered prior to any of the associated follow-up questions that gather more detail about each incident.


Over time, evidence has accumulated to indicate that the approach taken in 1992 may not be working as well as intended. For example, it is apparent from time stamp data and from direct observation of NCVS interviews that interviewers often go through the examples in the screening questions very quickly or skip them entirely, sometimes at the insistence of the respondents. This is especially the case after the first time-in-sample, after the respondent learns what is in the survey. Further, the “blocked” organization of the screening items may be less effective in a longitudinal setting (the NCVS interviews respondents up to seven times) than in a cross-sectional context, since respondents may learn the connection between answers to the screening items and the administration of a large number of follow-up questions. Intermixing at least some of the follow-up questions with the initial screening items (an approach called “interleaving”) may offer advantages over the blocked approach—producing a more conversational flow to the questions and improving the routing to later items.


A second improvement to the screening was updating the items targeting rape, sexual assault, and intimate partner violence. These improvements were based on prior research and recommendations for measuring these highly sensitive crimes (Kruttschnitt, Kalsbeek, & House, 2014, Estimating the Incidence of Rape and Sexual Assault, National Academies Press). In addition to changing the screening items, the CIR was modified to improve the classification of these types of incidents.


A third major focus of the NCVS Instrument Redesign and Testing Project has been to design a self-administered web version of the redesigned survey. One advantage of a web instrument is that it can increase the privacy of the interview relative to the interviewer-administered version. This may be important for crimes such as rape and sexual assault, and those involving an intimate partner or relative. A second advantage may be that web administration can significantly decrease the costs of the survey. A possible disadvantage is that the absence of a field representative may reduce the level of interest in the survey, reduce response rates, and raise concerns about the quality of the self-reported data.


To date, the NCVS Instrument Redesign and Testing Project has completed four rounds of cognitive testing and two waves of usability testing of the self-administered instrument. The cognitive testing focused on the redesigned screener, redesigned CIR, and the series of questions on perceptions of police and community safety. The cognitive testing included adults in all four rounds and youths in the last two rounds, with special attention in those last two rounds given to youth comprehension of the screening items, and the redesigned parts of the CIR.


The two waves of usability testing examined how the visual presentation of survey questions, instructions, and supplemental information affects users’ navigation and understanding of the self-administered instruments. Research has shown that these features can have a significant effect on the time required to complete the survey questions, on the accuracy of question-reading and data entry, and on users’ full use of the resources available to help them complete their task.2


The current clearance request covers the final cognitive testing and pretesting needed in order to maximize the success of the field test scheduled for later in the fall of 2019. The larger field test will compare three different conditions: 1) the current NCVS instrument (interviewer-administered), 2) the redesigned instrument administered by an interviewer, and 3) the redesigned instrument using a web mode. The web condition will lag the first two conditions when going to the field. The larger field test will conclude the NCVS Instrument Redesign and Testing Project.


Current Request for Cognitive Testing


The final round of cognitive testing will be conducted in-person with 15 adults who have been the victim of a crime in the past 12 months. The cognitive testing will focus on instrument changes that were implemented after the two rounds of usability testing, including changes to some of the community perception items (asked of all respondents); modifications to the interleaving and non-interleaving screening approaches; updates to the CIR in the sexual assault, hate crimes, police involvement and victim services modules; and changes to some of the person-level demographic items, including new items on disability status and race/ethnicity.


In order to gather sufficient feedback on this diverse set of items, all respondents will have been the victim of a crime in the past 12 months. We will recruit a mix of victims of different types of crimes, including at least 5 who have experienced unwanted sexual activity and 5 who have been the victim of an attempted or completed assault in the past 12 months.


Cognitive interviewing will be used to identify potential problems with each version of the questions. Trained cognitive interviewers will administer the entire questionnaire in an interviewer-administered mode. They will ask concurrent probes to gather additional information about comprehension and the response process. (See Attachment 1 for the cognitive protocol that includes the subset of items that will be cognitively probed.) Some probes will seek to determine how the participant understands specific terms and concepts, such as “other physical consequences” or “disabled.” Other probes will be used to determine whether the CIR items accurately capture what happened and whether the participants understood the questions easily and as intended. Interviewers will also use unscripted probes if the participant shows signs of difficulty, confusion, or frustration (e.g., “You seem to be having trouble with this question. Can you tell me what the problem might be?”).


All interviews will take place at the Westat offices in Rockville, MD. Interviews will last approximately 90 minutes to allow time for the administration of the screener and detailed crime incident items, as well as cognitive probing. Participants will receive $60 to encourage participation and offset the cost of their participation, such as transportation, parking, and childcare.


Recruiters will advertise the study to solicit participation, using internal recruiting databases and Craigslist ads. Interested participants will respond to a web-based recruitment screener to determine their eligibility. (See Attachments 2 and 3 for the recruitment screening questionnaire and the Craigslist ad.) Those selected to participate will be contacted by the recruiters and scheduled for their interview session. Individuals will be selected to the extent possible to achieve diversity by age, sex, educational attainment, race, and ethnicity across the interviews.


All cognitive interviews will be audio-recorded with the participant’s consent (See Attachment 4 for informed consent document). The audio recordings will only be accessible to project staff directly working on the project and no names or other personally identifying information (other than the participant’s voice itself) will be included in interviewer summaries of the audio recordings.


Current Request for Pilot Testing


The pilot test will use a convenience sample based on the locations of available and experienced Westat interviewing staff. BJS will target completion of 100 person-level interviews in each version of the interviewer-administered instrument. The interview sequence includes the following components:

  • Mailing of advance contact materials. Within two weeks of the start of data collection, approximately 75% of the households sampled for participation in the pilot will receive an advance letter that provides a high-level description of the study and includes “Frequently Asked Questions” on the back, and a study brochure (Attachments 5 and 6, respectively). Based on the number of completed person interviews after the first week to 10 days of data collection, all or some portion of the remaining addresses will receive the advance materials. This two-stage approach is preferred in case we reach our target of 100 person-level interviews per condition with the initial set of addresses, and do not need to contact any further households.

  • Household enumeration component. This component enumerates all household members by collecting name, date of birth or age, and sex. For youth 12 to 17 years old, the component will also identify a parent or guardian living in the household. Based on the responses to this component, the instrument will identify all persons in the household 12 or older and create person-level interview tasks for each. See Attachment 7 for the Enumeration instrument specifications. This component is the same for both pilot-study instruments, and largely reflects the current NCVS enumeration content. (The specifications do include some modifications needed for the self-administered version of the person-level component, though that component is not included in this pilot test.)

  • Person-level consent component. This component directs the interviewer’s review of the consent materials with the household member. At the completion of the review, the interviewer will record whether consent was given or refused. For youths 12 to 17 years old, the instrument will require that a parent/guardian give permission for the interviewer to speak to the child about participating, prior to asking the child for assent. See Attachments 8, 9 and 10 for the adult consent, parental/guardian permission to speak with their child, and youth assent programmer specifications. This component is also the same for both pilot test instruments.

  • Person-level interview component. At a high level, the person-level interview for both conditions includes a series of crime victimization screener items intended to prompt recall of crimes experienced during the reference period; and for each crime reported, a set of questions collecting details about the incident, referred to as the CIR. For the household respondent, both versions of the instrument will also ask a series of screener items intended to prompt recall of household-level crimes such as property theft and burglary, and an associated CIR. The redesigned version of the instrument will also include a set of “non-crime” questions pertaining to residents’ perception of safety, disorder, police legitimacy, and satisfaction with police. They will be asked of all persons regardless of whether they experienced any victimization during the reference period. The Census Bureau’s instrument specifications for the current NCVS interviewer-administered person-level questionnaire are shown in Attachment 11. Attachment 12 shows the specifications for the redesigned interviewer-administered version of the NCVS.

  • Person-level debriefing: After each person completes their interview, the interviewer will ask the respondent a few additional debriefing questions that collect evaluative information about the interview. The debriefing includes a few items about burden and clarity of the interview questions, as well as a few items the interviewer completes on their own about the situation in which they completed the interview. These data will inform potential updates at the point BJS decides to implement changes to the current NCVS fielded by the Census Bureau. Attachment 13 shows the specifications for the debriefing items asked of each person.


Westat will train 24 experienced field interviewers and 2 field supervisors for the pilot test. Half of the field interviewers will receive training on the current instrument, and the other half will receive training on the redesigned version. All interviewers will have supervisors dedicated to the same condition to which they were assigned. The experimental design of the larger field test also plans to assign interviewers and supervisors to only one of the two interviewer-administered conditions. Using that same approach for the pilot test will provide information about this field structure and resulting data collection efficiency.


Current Westat field interviewers from the D.C. metropolitan area, Atlanta, Charlotte, Cleveland, and Allentown will be asked to participate. Westat statisticians will draw a sample of approximately 800 addresses located in the same county as the interviewers, or in contiguous counties. The field period will start July 22 and continue through August 23, 2019. Interviewers will target completion of up to 100 person-level interviews per condition during that four-week period. The data collection period is driven by the schedule set for the larger field test and the amount of time needed to accommodate any necessary updates to training materials, field procedures and materials, or the instruments identified during the course of the pilot test. Routing changes to a few questions, as well as two new questions, could not be programmed in time for the scheduled pilot test; delaying the pilot test would have significant implications for any necessary changes to the field test instruments and protocols, the overall schedule for the field test, and project costs. The two new questions that are planned for the larger field test will be included in the final round of cognitive testing (Attachment 1). For more information on the unprogrammed routing changes, please see Attachment 14, specifically the highlighted portions. All updates are planned for the subsequent field test.


This pilot test will also provide an opportunity to evaluate different audio recording procedures. For the field test planned for the fall of 2019, Westat intends to use audio recordings to detect potential falsification and validate interviews, as well as to edit survey data based on responses to an open-ended prompt for a description of a reported victimization. The software application used for developing the redesigned instrument does not currently include a native recording capability. For pilot-test respondents in both the current-instrument and redesigned-instrument conditions who consent to audio recording, we will evaluate recordings captured via a mobile phone placed close to respondents, as compared to recordings captured through the recorder on the laptop. The recordings captured via either of these methods will include the audio from the entire person-level interview, resulting in potentially large files. The pilot test will provide information about potential storage and transmission issues with these files, as well as provide data regarding differential audio quality across the two recording methods. All recordings will be destroyed at the end of the project.


For quality assurance purposes, field supervisors will make follow-up calls to adults in selected households in order to validate the data collection. For the pilot test, two households will be re-contacted for each interviewer. Supervisors will only contact adults for validation purposes, never youths. See Attachment 15 for the questionnaire used by field supervisors for validation.


As part of the evaluation of the field procedures and training materials, BJS will conduct an interviewer debriefing per instrument condition at about the third week of the field period. The debriefing sessions for each condition will include 5 to 8 interviewers. The debriefings will focus on the following topics –


  • Respondent reasons for non-participation;

  • Components of the interview for which training did not adequately prepare them, or for which training seemed unnecessarily detailed given their experience;

  • Issues encountered with either recording method, including understanding of the recording protocol and respondents’ willingness to be recorded;

  • Confusion or issues with any field protocol rules;

  • Confusion or issues with the field management system; and

  • Challenges encountered with the instrument, and overall perceptions of instrument performance.


The pilot test will use two approaches for gathering evaluative information about the interview instrument. During the data collection period, field interviewers will report questions and issues regarding interview content to their field supervisors. Supervisors will track and submit those reports to project staff on a flow basis. The reports will provide input on potential changes for the field test later in the fall. In addition, as interviewers transmit completed cases to the home office, analysts will review the data for unexpected data patterns and high item-level missing data rates signaling potential bugs in the application code, or problematic questions. This information will also inform potential changes for the field test later in the fall.


Language


All interviews will be conducted in English.


Burden Hours for Testing


The burden for the cognitive testing task consists of participants being screened for and subsequently participating in in-person cognitive interviews. The burden associated with these activities is presented in the table below.


The burden for the pilot test task consists of respondent participation in an interview using one of two instrument versions collecting information about crime victimization they may have experienced during the reference period. The interview for the household respondent will include an additional household enumeration task, as well as responses to questions regarding household-level crimes such as property theft and burglary. The interview burden is equivalent across the two versions of the instrument as shown in the following table, and estimates of burden are consistent with the burden estimates of the current NCVS fielded by the Census Bureau.





Instrument Version                  Total Respondents   Average Administration Time (minutes)   Burden (hours)
Cognitive Test                      15                  90                                      22.5
Control Condition (Current NCVS)    100                 25                                      42
Redesign Version                    100                 25                                      42
TOTAL (Pilot Test)                  200                 25                                      84
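The burden figures above follow directly from multiplying respondents by average administration time and converting minutes to hours, with the pilot rows rounded to the nearest hour. A minimal sketch of that arithmetic (the function name is illustrative, not from the memo):

```python
# Burden hours = respondents x average administration time (minutes) / 60.
def burden_hours(respondents, minutes):
    return respondents * minutes / 60

cognitive = burden_hours(15, 90)          # 22.5 hours, reported to the half hour
control = round(burden_hours(100, 25))    # 41.67 hours, reported as 42
redesign = round(burden_hours(100, 25))   # 42
pilot_total = control + redesign          # 84; the TOTAL row covers the pilot test only
print(cognitive, control, redesign, pilot_total)
```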



Justification of Respondent Burden


Every effort has been made to minimize respondent burden and government cost in the conduct of the proposed cognitive test and pilot test. The sample size for the cognitive test reflects a need to recruit victims of different types of crimes in order to gather feedback on different items within the crime incident report. The sample size for the pilot test reflects the need to gather data from multiple interviewers, each of whom has completed close to 10 cases and thus has adequate experience to speak to field protocol and training issues specific to this survey.


Cost to the Federal Government


The total cost of conducting the cognitive testing will be approximately $15,625, including incentive costs. The total cost of conducting the pilot test will be approximately $361,000 under the cooperative agreement with Westat for the National Crime Victimization Survey (NCVS) Instrument Redesign and Testing Project.


Data Analysis


For the cognitive testing, interviewers will summarize the findings from each cognitive interview. Interviewers will prepare summary findings on each completed interview based on the completed questionnaire, notes taken during the interview, and associated audio recordings. The summaries will be analyzed using qualitative data analysis software to help identify common themes organized by overall questionnaire issues, individual questionnaire items and sections, and participants’ overall reactions to the questionnaire.


Upon completion of all cognitive testing, a draft cognitive testing report will be prepared that will include recommendations for final revisions to the survey for the field test. The report will also provide detailed information on the cognitive testing methodology, basic characteristics of the respondents, and any issues with question comprehension.


Pilot test data will not be weighted, and will not be used to make estimates of victimization. Evaluation will focus on —


  • Contact and cooperation rates;

  • Interviewer and supervisor reports of problems with the questionnaires, other materials, or procedures;

  • Review of interview recordings;

  • Review of data frequencies to identify issues with skip patterns or item nonresponse;

  • Comparison of narrative descriptions of victimization incidents with question responses and resulting type-of-crime classification.


The goal of these evaluations will center on improving interviewer training and field procedures for the field test, and on correcting any issues with the questionnaire programs.


Protection of Human Subjects


There is some risk of emotional distress for the respondents given the sensitive nature of the topics, particularly since the questions are of a personal nature; however, appropriate safeguards are in place. All cognitive interviewers, field interviewers, supervisors, and field directors will receive training in a distress protocol reviewed and approved by Westat’s Institutional Review Board (IRB), which has federal-wide assurance.


With the distress protocol, interviewers will be trained to recognize when respondents are becoming emotionally upset (See Attachment 16 for the distress protocol). They will also be trained on how to respond when a respondent becomes upset. Any respondent who appears to be in distress will be asked whether they wish to stop the interview. All respondents will receive a list of toll-free hotline numbers before leaving the interview session, regardless of whether the interviewer identified any distress (See Attachment 17).


Informed Consent


Adult participants in both the cognitive testing and the pilot test will review the informed consent document with the aid of the interviewer. If consent is provided, the interviewer will select a checkbox within the instrument consent component indicating that consent was provided. Should an individual refuse informed consent, they will be excused from participation and thanked for their time; the interviewer will record the decision by selecting the checkbox indicating that consent was refused. At the time of consent, the interviewer will also ask permission to audio record the interview. (See Attachments 8, 9 and 10 for the informed consent and assent documents for adults and youth for the pilot test, and Attachment 4 for the informed consent document for the cognitive test.)


To interview youth participants in the pilot test, BJS must have both parental/guardian permission and youth assent. (Youth are not included in the cognitive testing.) The instrumentation will require parental/guardian permission to speak with the child before making the request for youth assent available to the interviewer. As with adult consent, the interviewer will document parental permission and youth assent or refusal within the instrument consent component on the laptop. Should a parent or a child refuse informed consent, the child will be excused from participation and thanked for their time. As with the adults, interviewers will ask for permission to audio record the interview as part of both the parental permission and youth assent procedures. Both parent and child will receive a copy of the permission/assent form for their records.


Use of Information Technology to Reduce Burden


The cognitive testing study will utilize technology to facilitate recruitment and the scheduling process while also reducing participant burden and controlling study costs. Recruitment efforts will use email communications when possible, because participants increasingly prefer to communicate via email so they can respond when it is convenient. Using email for recruitment and scheduling can help to reduce participant burden and save time and money that would otherwise be spent conducting telephone calls, leaving voice messages, and making call-backs. Cognitive interviewers will navigate through the questionnaire using the same Computer Assisted Personal Interviewing (CAPI) instrument as described below.


Data collection for the pilot test will use CAPI instruments that include response-driven skip patterns that minimize the presentation of questions not relevant to the types of crimes reported by respondents. For example, for respondents who report no victimizations, the CAPI instruments will skip them completely out of the detailed CIR. All addresses at which interviewers have made multiple unsuccessful contact attempts will receive an email and toll-free phone number to contact to facilitate scheduling of an interview. Similarly, interviewers will comply with respondent requests to use text or email for follow-up communication regarding scheduling of person-level interviews as needed after completion of the household enumeration component.


Data Confidentiality and Security


BJS is authorized to conduct this data collection under 34 U.S.C. § 10132. BJS will protect and maintain the confidentiality of personally identifiable information (PII) to the fullest extent under federal law. BJS, its employees, and its contractors (Westat staff) will only use the information provided for statistical or research purposes pursuant to 34 U.S.C. § 10134, and will not disclose respondent information in identifiable form to anyone outside of the BJS project team. All PII collected under BJS’s authority is protected under the confidentiality provisions of 34 U.S.C. § 10231. Any person who violates these provisions may be punished by a fine up to $10,000, in addition to any other penalties imposed by law. Further, per the Cybersecurity Enhancement Act of 2015 (6 U.S.C. § 151), federal information systems are protected from malicious activities through cybersecurity screening of transmitted data.


Participation in the cognitive testing and the pilot test is voluntary. Personally identifiable information (PII), including the audio recordings, will be encrypted and stored on the laptops or mobile phones temporarily until transmission to the Westat home office. All files will remain encrypted and compressed for transmission to the Westat home office. All PII will be securely stored in password-protected files to which only project staff will have access, and will be destroyed after the study is finished per contract requirements.


The procedures proposed for this study have been reviewed and conditionally approved by Westat’s IRB, and data collection will not begin until final IRB approval is received.


1 Note that the field test instruments will use a 12-month reference period, to increase the number of victimizations reported and hence the number of crime incident reports (CIRs) available for analysis.

2 Couper, M. (1999). The application of cognitive science to computer assisted interviewing. In Sirken, M., Hermann, D., Schechter, S., Schwarz, N., Tanur, J., & Tourangeau, R. (Eds.), Cognition and Survey Research (pp. 277–300). New York: Wiley.

