Collection of Information Employing Statistical Methods
The sample universe for the Identity Theft Supplement (ITS) is all persons age 16 or older in all NCVS interviewed households. The NCVS sample of households is drawn from the more than 120 million U.S. households nationwide and excludes military barracks and institutionalized populations. In 2021, the annual national sample is planned to include approximately 254,000 designated addresses located in 542 stratified Primary Sampling Units (PSUs) throughout the United States.
Frame
The Master Address File (MAF) contains all addresses from the most recent decennial census plus updates from the United States Postal Service, state and local address lists, and other address listing operations. The MAF is the frame for the target NCVS population. Every ten years, the Census Bureau redesigns the samples for all of its continuing demographic surveys, including the NCVS. In general, the purpose of these redesigns is to capture population shifts measured by the most recent decennial census. The phase-out of the 2000 sample design and the phase-in of the 2010 sample design began in January 2015 and continued through December 2017. Beginning in 2016, some PSUs were removed from the sample, some new PSUs were added to the sample, and some continuing PSUs that were selected for both the 2000 and 2010 designs remained in the sample. The 2018 NCVS was the first full year of the phased-in 2010 design, in which all PSUs and addresses were from the 2010 design.
Rotating Panel Design
The NCVS uses a rotating sample. The sample consists of seven groups for each month of enumeration. Each of these groups stays in the sample for an initial interview and six subsequent interviews, for a total of seven interviews for the typical household. During the 6-month period when the ITS is administered, the full sample of seven rotation groups will be interviewed, with one-sixth of the sample interviewed each month. One rotation group enters the sample for its first interview each month.
Sampling
The sample design for the NCVS is a stratified, multi-stage cluster sample. Sample selection for the NCVS is done in three stages: the selection of primary sampling units (PSUs), the selection of address units within sample PSUs, and the selection of persons and households from those addresses to be included in the sample.
Stage 1. Defining and Selecting PSUs
Defining PSUs - Formation of PSUs begins with listing counties and independent cities in the target area. For the NCVS, the target area is the entire country. The PSUs comprising the first stage of the sample are formed from counties or groups of adjacent counties based upon data from the most recent decennial census and the American Community Survey (ACS). The counties are either grouped with one or more contiguous counties to form PSUs or are PSUs unto themselves. For counties that are grouped, the groupings are based on certain characteristics such as total land area, current and projected population counts, large metropolitan areas, and potential natural barriers such as rivers and mountains. For the NCVS, decennial census counts, ACS estimates, and administrative crime data drawn from the FBI’s Uniform Crime Reporting (UCR) Program are also used to stratify the PSUs. The resulting county groupings are called PSUs.
After the PSUs are formed, the larger PSUs are included in the sample with certainty and are considered to be self-representing (SR). The remaining PSUs, called non-self-representing (NSR) because only a subset of them are selected, are combined into strata by grouping PSUs with similar geographic and demographic characteristics.
Stratifying PSUs - For the 2010 design, the NSR PSUs are grouped with similar NSR PSUs within states to form strata. Each SR PSU forms its own stratum. The data used for grouping the PSUs are also based on decennial census demographic data, ACS data, and administrative crime data. NSR PSUs are grouped to be as similar or homogeneous as possible. Just as the SR PSUs must be large enough to support a full workload, so must each NSR stratum. The most efficient stratification scheme is determined by minimizing the variance both between and within PSUs.
Selecting PSUs - The SR PSUs are automatically selected for sample or “selected with certainty.” NSR PSUs are sampled with probability proportional to the population size using a linear programming algorithm. One PSU is selected from each NSR stratum. The 2010 design NCVS sample includes 339 SR PSUs and 203 NSR PSUs. PSUs are defined, stratified, and selected once every 10 years. The 2010 design sample PSUs were sampled using population data from the 2010 census.
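To illustrate the logic of selecting one PSU from each NSR stratum with probability proportional to population size, the sketch below (in Python) uses the textbook cumulative-size method. It is illustrative only: the Census Bureau's production selection uses a linear programming algorithm, and the strata, PSU names, and population counts shown are hypothetical.

import random

# Hypothetical NSR strata; each PSU is listed with a hypothetical population count.
strata = {
    "Stratum 01": {"PSU A": 310_000, "PSU B": 250_000, "PSU C": 190_000},
    "Stratum 02": {"PSU D": 420_000, "PSU E": 280_000},
}

def select_pps(psus):
    """Select one PSU with probability proportional to its population size."""
    total = sum(psus.values())
    draw = random.uniform(0, total)      # random point on the cumulative size scale
    running = 0
    for psu, size in psus.items():
        running += size
        if draw <= running:
            return psu

for stratum, psus in strata.items():
    print(stratum, "->", select_pps(psus))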
Stage 2. Preparing Frames and Sampling within PSUs
Frame Determination - The 2010 sample utilizes two dynamic address-based sampling frames, one for housing units (HUs) and one for group quarters (GQs). Both frames are based upon the MAF. The MAF is continually updated by various Census Bureau programs and external sources. New housing units are added to the MAF, and therefore the NCVS sampling frame, through semiannual updates from a variety of address sources, including the U.S. Postal Service Delivery Sequence File, local government files, and field listing operations.
In the 2010 design, each address in the country was assigned to the sampling frame based on the type of living quarters. Two types of living quarters are defined in the decennial census. The first type is a housing unit. A HU is a group of rooms or a single room occupied as separate living quarters or intended for occupancy as separate living quarters. A HU may be occupied by a family or one person, as well as by two or more unrelated persons who share the living quarters. The second type of living quarters is a GQ. GQs are living quarters where residents share common facilities or receive formally authorized care. About 3% of the population counted in the 2010 Census resided in GQs; of those, less than half resided in non-institutionalized GQs. About 97% of the population counted in the 2010 Census lived in HUs.
Within-PSU Sampling - All of the Census Bureau’s continuing demographic surveys, such as the NCVS, are sampled together. This procedure takes advantage of updates from the January MAF delivery and ACS data. This within-PSU selection occurs every year for housing units and every three years for GQs.
Selection of samples is done, sequentially, one survey at a time. Each survey determines how the unit addresses within the frame should be sorted prior to sampling. For the NCVS, each frame is sorted by geographic variables. A systematic sampling procedure is used to select addresses from each frame. A skeleton sample is also selected in every PSU. Every six months new addresses on the MAF are matched to the skeleton frame. The skeleton frame allows the sample to be refreshed with new addresses and thereby reduces the risk of under-coverage errors due to an outdated frame.
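As a rough illustration of how a systematic sample is drawn from a sorted frame, the Python sketch below selects addresses at a fixed interval after a random start. This is the generic textbook procedure, not the Census Bureau's production code, and the frame and sample size are hypothetical.

import random

def systematic_sample(frame, n):
    """Select n units from a sorted frame using a random start and a fixed interval."""
    interval = len(frame) / n                # sampling interval
    start = random.uniform(0, interval)      # random start within the first interval
    return [frame[int(start + i * interval)] for i in range(n)]

# Hypothetical frame of addresses, already sorted by geographic variables
frame = [f"Address {i:05d}" for i in range(20_000)]
sample = systematic_sample(frame, n=50)
print(sample[:5])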
Addresses selected for a survey are removed from the frames, leaving an unbiased or clean universe behind for the next survey that is subsequently sampled. By leaving a clean universe for the next survey, duplication of addresses across surveys is avoided. This helps preserve response rates by ensuring that no unit falls into more than one survey sample.
Stage 3. Persons within Sample Addresses
The last stage of sampling is done during the initial contact of the sample address during the data collection phase. For the NCVS, if the address is a residence and the occupants agree to participate, then an attempt is made to interview every person age 12 or older who lives at the sample address. After the NCVS questionnaire has been administered, those age 16 or older in the household are asked to complete an ITS interview. Only those who have completed an NCVS questionnaire will be interviewed for the ITS. The NCVS has procedures to determine who lives in the sample unit, and a household roster is completed with names and other demographic information for all persons who live there. If someone moves out of (or into) the household during the interviewing cycle, he or she is removed from (or added to) the roster.
The expected NCVS sample size for July through December 2021 is 127,000 households. Based on 2019 NCVS response rates, we expect approximately 61% of households (or about 77,470 households) to be interviewed for the ITS, with an average of 1.85 persons (who are 16 years or older) in each interviewed household. Thus, we expect about 143,320 total eligible persons for the 2021 ITS.
Based on 2018 ITS data collection, we expect interviewers will be able to obtain ITS interviews for 73.2% of the ITS-eligible household members. A total of 104,910 individuals aged 16 and older are expected to be interviewed for the ITS during the 6-month collection period.
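The expected sample sizes above follow from straightforward multiplication; the short Python calculation below reproduces the figures (rounded as in the text) from the assumptions stated in the two preceding paragraphs.

ncvs_households = 127_000          # expected NCVS household sample, July-December 2021
household_rate = 0.61              # expected household interview rate (based on 2019 NCVS)
persons_per_household = 1.85       # average persons age 16 or older per interviewed household
its_person_rate = 0.732            # expected ITS interview rate among eligible persons (2018 ITS)

interviewed_households = ncvs_households * household_rate          # about 77,470
eligible_persons = interviewed_households * persons_per_household  # about 143,320
its_interviews = eligible_persons * its_person_rate                # about 104,910

print(f"{interviewed_households:,.0f} households, {eligible_persons:,.0f} eligible persons, "
      f"{its_interviews:,.0f} ITS interviews")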
Weighting and Estimation
The purpose of the ITS is to be able to make inferences about identity theft victimization for the population of persons age 16 or older in the United States. Before such inferences can be drawn, it is necessary to adjust, or weight, the sample of people to ensure it is similar to the entire population in this age group. The ITS weights are a combination of household-level and person-level adjustment factors. Household and person respondents from the NCVS sample are adjusted on a bi-annual basis to represent the U.S. population age 12 or older. For the ITS, the population is restricted to persons age 16 or older.
NCVS household and person weights are first adjusted to account for any subsampling that occurs within large GQs. The NCVS nonresponse weighting adjustment then allocates the sampling weights of nonresponding households and persons to respondents with similar characteristics. Additional factors are then applied to correct for differences between the sample distributions of age, race and Hispanic origin, and sex and the population distributions of these characteristics. The resulting weights are assigned to all interviewed households and persons in the NCVS file.
ITS weighting begins with the NCVS final person weight, which is then multiplied by an ITS noninterview adjustment factor. The ITS noninterview adjustment factors are computed by distributing the weights of ITS noninterviews to the weights of the ITS interviews, with adjustment cells determined by age, race/Hispanic origin, and sex. The result is an ITS person-level weight that can be used for producing estimates from the ITS variables.
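The cell-based noninterview adjustment can be thought of as redistributing, within each adjustment cell, the weight carried by ITS noninterviews to the ITS interviews. The Python sketch below shows this logic with hypothetical records and cells; it is a simplified illustration, not the Census Bureau's weighting system.

from collections import defaultdict

# Hypothetical records: (adjustment cell, NCVS final person weight, completed ITS interview?)
records = [
    (("16-24", "White non-Hispanic", "Female"), 1500.0, True),
    (("16-24", "White non-Hispanic", "Female"), 1450.0, False),
    (("16-24", "White non-Hispanic", "Female"), 1600.0, True),
    (("25-49", "Hispanic", "Male"), 2100.0, True),
    (("25-49", "Hispanic", "Male"), 1900.0, False),
]

eligible_weight = defaultdict(float)    # total weight of all ITS-eligible persons in the cell
respondent_weight = defaultdict(float)  # weight of ITS respondents in the cell
for cell, weight, responded in records:
    eligible_weight[cell] += weight
    if responded:
        respondent_weight[cell] += weight

# Noninterview adjustment factor per cell: total eligible weight / responding weight
factor = {cell: eligible_weight[cell] / respondent_weight[cell] for cell in eligible_weight}

# ITS person weight = NCVS final person weight x noninterview adjustment factor (respondents only)
its_weights = [weight * factor[cell] for cell, weight, responded in records if responded]
print(factor)
print(its_weights)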
Variance Estimates
The NCVS and ITS estimates come from a sample, so they may differ from figures from an enumeration of the entire population using the same questionnaires, instructions, and enumerators. For a given estimator, the average squared difference between estimates based on repeated samples and the estimate that would result if the sample were to include the entire population is known as sampling error. The sampling error quantifies the amount of uncertainty in an estimate as a result of selecting a sample.
Variance estimates can be derived using direct estimation or generalized variance functions (GVFs). Replication methods provide estimates of variance for a wide variety of probability sample designs, even when complex estimation procedures are used. These methods require the sample selection, data collection, and estimation procedures to be carried out (i.e., replicated) several times. In addition, the Census Bureau produces parameters for GVFs that estimate the variance of the prevalence of identity theft based on the value of the estimate. To do this, estimates and their relative variances are fit to a regression model using an iterative weighted least squares procedure in which the weight is the inverse of the square of the predicted relative variance.
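As an illustration of the GVF fitting described above, the Python sketch below fits a function of the common form relvariance(x) = a + b/x by iterative weighted least squares, with the weight equal to the inverse of the square of the predicted relative variance. The estimates, relative variances, and functional form are hypothetical placeholders, not published NCVS or ITS parameters.

import numpy as np

x = np.array([50_000, 200_000, 750_000, 2_000_000, 8_000_000], dtype=float)  # hypothetical estimates
relvar = np.array([0.052, 0.014, 0.0042, 0.0017, 0.00055])                   # hypothetical relative variances

a, b = 0.0, relvar.mean() * x.mean()       # crude starting values for the iteration
for _ in range(20):
    pred = a + b / x                       # predicted relative variance under current parameters
    w = 1.0 / pred**2                      # weight: inverse square of the predicted relative variance
    X = np.column_stack([np.ones_like(x), 1.0 / x])
    W = np.diag(w)
    a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ relvar)   # weighted least squares update

print(f"Fitted GVF parameters: a = {a:.6f}, b = {b:.1f}")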
The ITS is designed to calculate national estimates of identity theft for the target population – the noninstitutionalized resident population age 16 years or older. The 2021 ITS will be administered to all age-eligible NCVS respondents during the 6-month period from July through December of 2021.
Data Collection
For the six-month period, July through December 2021, the ITS will be administered in approximately 77,470 interviewed households. The NCVS uses a rotating sample that consists of seven groups for each month of enumeration. Each housing unit selected for the NCVS remains in the sample for three years, with seven interviews taking place at 6-month intervals.
The NCVS-500 (control card) is used to complete a household roster with names and other demographic information of the household members. For some demographic questions that are asked directly of respondents, flashcards are used, including for education, race and Hispanic origin, sexual orientation, employment, and household income. Respondents are asked to report victimization experiences occurring in the six months preceding the month of interview. The NCVS crime screener instrument (NCVS-1) is asked of all respondents age 12 or older in the household and is used to ascertain whether the respondent has experienced a personal crime victimization during the prior six months and is therefore eligible to be administered the NCVS crime incident report instrument (NCVS-2). The NCVS-1 collects the basic information needed to determine whether the respondent experienced a crime victimization (rape or sexual assault, robbery, aggravated or simple assault, personal larceny, burglary, motor vehicle theft, or other household theft). When a respondent reports an eligible personal victimization, the NCVS-2 is then administered to collect detailed information about the crime incident. The NCVS-2 is administered for each incident the respondent reports. For each victimization incident, the NCVS-2 collects information about the offender (e.g., sex, race and Hispanic origin, age, and victim-offender relationship), characteristics of the crime (including time and place of occurrence, use of weapons, nature of injury, and economic consequences), whether the crime was reported to police, reasons the crime was or was not reported, and victim experiences with the criminal justice system. All NCVS forms and materials including the NCVS-500, NCVS-1, and NCVS-2 have been previously approved by OMB (OMB NO: 1121-0111).
Each interview period, the interviewer completes or updates the household composition component of the NCVS interview and asks the crime screener questions (NCVS-1) for each household member age 12 or older. The interviewer then completes a crime incident report (NCVS-2) for each reported crime incident identified in the crime screener. Once the NCVS interview is completed (i.e., nonvictims responded to all NCVS-1 screening questions or victims completed all necessary NCVS-2 incident reports), the interviewer administers the ITS questionnaire to persons age 16 or older.
The first contact with a household is by personal visit and subsequent contacts may be by telephone. For the second through seventh visits, interviews are done by telephone whenever possible. Approximately half of all interviews conducted each month are by telephone.
ITS Data Collection
The ITS is designed to produce national estimates of identity theft for the target population – all persons age 16 or older living in NCVS households.
The 2021 ITS instrument includes several sets of questions:
Section A contains screener questions that ask respondents whether they have experienced one of six types of identity theft in their lifetime: (1) successful misuse of an existing bank account; (2) successful misuse of an existing credit card account; (3) successful misuse of an existing email or social media account; (4) successful misuse of any other type of existing account; (5) successful misuse of personal information to open a new account; and (6) successful misuse of personal information for other fraudulent purposes, such as filing a fraudulent tax return. For each type of identity theft reported, the respondent is asked whether it occurred in the past 12 months. If so, they are asked to give the month and year in which the theft most recently occurred. For respondents who report an eligible type of identity theft that occurred in the past 12 months, the ITS instrument proceeds to Section B and begins asking about details of the most recent incident of identity theft, based on the month and year given for each type of past-year identity theft. If they report eligible types of identity theft during their lifetime but no identity theft during the past 12 months, the instrument skips to Section G, which is described below. If a respondent does not report any eligible type of identity theft in their lifetime, the instrument skips to Section H, which is described below.
Section B asks about the discovery of the most recent incident of identity theft. This includes when the victim first discovered the theft, how it was discovered, and how long the offender was misusing the victim's personal information before the crime was discovered.
Section C asks about contacting financial institutions, credit bureaus, law enforcement and other government agencies regarding the most recent incident of identity theft.
Section D asks about physiological and psychological distress that victims may have experienced due to the most recent incident of identity theft. It also asks about any professional help that the victim may have received for that distress.
Section E asks about the offender in the most recent incident of identity theft. This includes whether the victim knew the offender and the offender’s relationship to the victim.
Section F asks about the financial impact of identity theft. This includes the financial loss due to the most recent incident of identity theft as well as the total financial loss across all incidents of identity theft that occurred in the past 12 months. This section also asks about potential credit and financial problems the victims faced with the most recent incident of identity theft.
Section G asks respondents who reported any lifetime identity theft in Section A about any identity theft that occurred prior to the past 12 months. It also asks about any associated relationship, physiological, psychological, credit and financial problems that may have occurred during the past 12 months.
Section H asks about actions respondents may take to prevent identity theft.
Section I asks respondents about their experiences with data breaches.
The complete 2021 ITS instrument is included for review as Attachment 1. For details on testing of the instrument, see Section 4.
Every effort has been made to make the survey materials clear and straightforward. The ITS instrument has been designed to make collection of the data as concise and easy for the respondent as possible. The ITS questions have been cognitively tested to ensure that they are easily understood by most respondents.
Contact Strategy
Contact materials focus on the NCVS in general and do not specifically reference the ITS or other supplemental surveys. The Census Bureau mails notifications to households prior to data collection, interviewers contact households for the first time in-person, and interviewers conduct nonresponse follow-up. The Census Bureau mails an introductory letter explaining the NCVS to the household before the interviewer's visit or call. When they visit a household, the interviewers carry cards identifying them as Census Bureau employees. Potential respondents are assured that their answers will be held in confidence and are used for statistical purposes. For respondents who have questions about the NCVS, interviewers provide a brochure, and can also reference information in their Information Card Booklet that contains information such as uses of NCVS data and frequently asked questions and answers. After interviews are completed at each enumeration period, the Census Bureau mails thank you letters to the household. All forms and materials used for contact with the household have been previously approved by OMB (OMB NO: 1121-0111). In addition, after each ITS interview is completed, respondents are provided a brochure that contains information about actions to take if a respondent’s identity is stolen (Attachment 5).
The Census Bureau trains interviewers (see Interviewer Training) to obtain respondent cooperation and instructs them to make repeated attempts to contact respondents and complete all interviews. The interviewer obtains demographic characteristics of persons not interviewed for use in the adjustment for nonresponse. NCVS and ITS response rates are monitored on a monthly basis and compared to the previous month’s average to ensure their reasonableness.
As part of their job, interviewers are instructed to keep noninterviews, or nonresponse from a household or persons within a household, to a minimum. Household nonresponse occurs when an interviewer finds an eligible household but completes no interviews. Person nonresponse occurs when an interview is obtained from at least one household member, but an interview is not obtained from one or more other eligible persons in that household. Maintaining a high response rate involves the interviewer’s ability to enlist cooperation from all types of people and to contact households when people are most likely to be home. As part of their initial training, interviewers are exposed to ways in which they can persuade respondents to participate, as well as strategies to use to avoid refusals. Furthermore, the office staff makes every effort to help interviewers maintain high participation by suggesting ways to obtain an interview, and by making sure that sample units reported as noninterviews are in fact noninterviews. Also, survey procedures permit sending a letter to a reluctant respondent as soon as a new refusal is reported by the interviewer to encourage their participation and to reiterate the importance of the survey and their response.
Interviewer Training
Training for NCVS interviewers consists of classroom and on-the-job training. Initial training for interviewers consists of a full-day pre-classroom self-study, 4-day classroom training, post-classroom self-study, and on-the-job observation and training. Initial training includes topics such as protecting respondent confidentiality, gaining respondent cooperation, answering respondent questions, proper survey administration, use of systems to collect and transmit survey data, NCVS concepts and definitions, and completing simulated practice NCVS interviews. The NCVS procedures and concepts taught in initial training are also regularly reinforced for experienced NCVS interviewers through monthly written communications, ongoing feedback from supervisors' observations of interviews, and monthly performance and data quality feedback reports.
NCVS interviewers also receive specific training on the ITS including eligibility, the organization of the ITS interview, content of the survey questionnaire, addressing potential respondent questions, and internal check items that are in place to help the interviewer ensure that the respondent is being asked the appropriate questions and follow-ups are posed when clarification is needed. Interviewers receive a self-study training manual that they are required to read and they must complete a Final Review Exercise to verify their knowledge of the concepts presented in the self-study training manual. The ITS training materials are distributed to interviewers approximately a month before the supplement goes into the field.
Monitoring Interviewers
In addition to the above procedures used to ensure high participation rates, the Census Bureau implements additional performance measures for interviewers based on data quality standards. Interviewers are trained and assessed on administering the NCVS-1, NCVS-2, and ITS exactly as worded to ensure the uniformity of data collection, completing interviews in an appropriate amount of time (not rushing through them), and keeping item nonresponse and “don’t know” responses to a minimum. The Census Bureau also uses quality control methods to ensure that accurate data are collected. Interviewers are continually monitored by their Regional Office to assess whether performance and response rate standards are being met, and corrective action is taken to assist and discipline interviewers who are not meeting the standards.
Reinterview is a major feature of the program that the Census Bureau has implemented to assess data quality and the reporting of crimes in the NCVS. The NCVS reinterview uses two approaches to validate interviewer performance: random and supplemental selection. The random reinterview approach consists of selecting a sample of each interviewer's work to review over the data collection cycle. The supplemental approach allows supervisors to identify additional interviewers or cases for review throughout the cycle. Reinterview requires that a supervisor or experienced interviewer re-contact respondents in a sample of previously interviewed households. Reinterviewers verify that the original interviewer contacted the correct sample unit, determined the correct household composition, and classified noninterview households correctly. Reinterviewers also verify the household roster and tenure, ensure specific questions are covered, and re-ask a subset of the NCVS crime screener questions. Data from the reinterview program are also used to assess the reporting of crimes in the NCVS, specifically missed crimes. The data are used to estimate household- and person-level missed crimes.
Another component of the data quality program is monthly feedback. In 2011, the Census Bureau implemented a series of field performance and data quality indicators. Previously, high response rates were the primary measure of interviewer performance. The data quality indicators are tracked through the Census Bureau's expanded Performance and Data Analysis (Giant PANDA) tool, with monthly reports provided to the field. Under the revised performance structure, interviewers are monitored on the following –
response rates (household, person, and the current supplement in the field);
time stamps (the time it takes to administer the screener questions on the NCVS-1 or the crime incident questions on the NCVS-2);
overnight starts (interviews conducted very late at night or very early in the morning);
late starts (cases not started until the 15th or later in the interview month); and
absence of contact history records (cases missing records of contact attempts with the household and/or persons within the household).
Noncompliance with these indicators results in supervisor notification and follow-up with the interviewer. The follow-up activity may include simple points of clarification (e.g., the respondent works nights and is only available in the early morning for an interview), additional interviewer training, or removal of the interviewer from the survey.
Nonresponse and Response Rates
Interviewers are able to obtain NCVS interviews with about 83% of household members in 71% of the occupied units in sample in a given month. The interviewers are trained to make repeated attempts at contacting respondents and to complete interviews with all eligible household members.
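For context, a rough combined figure can be obtained by multiplying the two rates above, as in the short Python calculation below (this treats person response as roughly independent of household response, which is a simplifying assumption).

household_rate = 0.71   # occupied sample units with a completed household interview
person_rate = 0.83      # interviewed members within responding households

overall_person_rate = household_rate * person_rate
print(f"about {overall_person_rate:.0%} of eligible persons overall")   # roughly 59%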
Annually, the Census Bureau conducts complete analyses of nonresponse. As was done for previous iterations of the ITS, the Census Bureau will report nonresponse and response rates, respondent and nonrespondent distribution estimates, and proxy nonresponse bias estimates for various subgroups for the 2021 ITS. Should the analyses reveal evidence of nonresponse bias, BJS will work with the Census Bureau to assess the impact to estimates and ways to adjust the weights accordingly. The interviewers obtain demographic characteristics of noninterview persons for use in the adjustment for nonresponse.
All survey questions in the 2021 ITS have been tested and are known to be easily understood and answered. Beginning in early 2019, BJS collaborated with the Census Bureau to update the 2018 questionnaire in preparation for the next administration of the ITS. During this update, in addition to minor wording changes to several questions and text changes to the data breach questions, BJS proposed deleting the questions about online shopping and about knowledge of the ability to obtain a free credit report every year. In April 2019, the Census Bureau provided an expert review of the questionnaire through its Center for Behavioral Survey Methods. This expert review led to the approval of the changes proposed by BJS.
In June 2019, BJS initiated a series of meetings in which methodological concerns regarding the ITS were discussed. Among these concerns was that the incidents captured in each wave of the ITS are not bounded. Data analysis by BJS revealed that some respondents reported their most recent incident of identity theft as occurring during the past 12 months but also reported first discovering the same incident prior to the past 12 months. This finding led BJS to conclude that some ITS respondents may have been telescoping incidents into the 1-year reference period of the ITS when the incident may have occurred prior to the reference period.
As a result of these methodological concerns regarding telescoping and unbounded incidents, BJS contracted with RTI Internationala through the National Victimization Statistical Support Program (NVSSP) in the fall of 2019 to conduct research to improve the ITS questionnaire. To allow time for this research, the ITS was taken off its 2-year cycle and was not administered in 2020. This research consisted of four parts: (1) an analysis of state laws on identity theft; (2) secondary data analysis of previously collected ITS data; (3) cognitive interviewing; and (4) an online pilot test.
The analysis of identity theft laws in the 50 states and Washington, DC, was conducted to determine whether the definition of identity theft in the 2018 ITS questionnaire reflects how identity theft is defined legally (Attachment 6). In this analysis, RTI International examined similarities and variations in the legal elements of identity theft across the 50 states and Washington, DC. The key elements of the laws that were examined were (1) the definition of personally identifiable information (PII); (2) how PII is misused – whether the law focuses on just financial gain or on nonfinancial uses as well; (3) the severity of punishments; and (4) the statute of limitations for charging identity theft offenders. The analysis was done first by identifying identity theft laws from the National Conference of State Legislatures' "Identity Theft" database, the Identity Theft and Credit Card Fraud Laws available on FindLaw's website, and the LexisNexis legal database. Next, these laws were analyzed to determine whether a state explicitly mentioned and regulated the aforementioned key elements. The analysis determined that the definitions of identity theft used in the current ITS cover a wide range of activities that would fall under legal definitions of identity theft and that the data from the ITS could be modified to fit a specific state's definition if necessary. Based on the analysis, no change was recommended to the BJS definition of identity theft currently operationalized in the ITS.
After the legal analysis was conducted, secondary analysis of data from previous waves of the ITS was done to examine three issues affecting the definition and prevalence of identity theft that were not examined in the analysis of state laws: (1) the reference point used for determining whether an incident is within the survey reference period; (2) the potential for respondents to telescope incidents into the reference period; and (3) the inclusion of attempted incidents in the definition of identity theft (Attachment 7). This quantitative analysis, which included generating counts and percentages and conducting significance tests, utilized data from the 2008, 2014, 2016, and 2018 ITS.
The results showed the following: (1) the majority of victims were able to provide dates for both the date of first discovery and the date of occurrence of the most recent incident of identity theft, two of the potential reference points provided by the ITS; (2) it was difficult to find conclusive evidence that respondents were telescoping identity theft incidents into the one-year reference period, but given the potential for telescoping on the core NCVS and the potential for respondent confusion regarding the different reference points in an identity theft incident, testing the use of dual reference periods would be very useful; and (3) respondents may have had difficulty separating incidents of attempted identity theft from completed incidents of identity theft.
These results led to several recommendations: (1) continue to use the most recent incident of identity theft as the reference point to determine whether an incident is in scope for the ITS; (2) ask for a date for the most recent incident of identity theft; (3) use a dual reference period to reduce the possibility of telescoping; and (4) ask respondents only about the most recent successful incidents of identity theft when responding to follow-up questions regarding the details of incidents. These recommendations were implemented in a draft of the screener questions. The drafted screener questions excluded attempted identity theft; contained a dual reference period for each type of identity theft (misuse of an existing bank account, misuse of an existing credit card account, misuse of another type of existing account other than a bank or credit card account, misuse of personal information to open a new account, and misuse of personal information for other fraudulent purposes such as giving false identifying information to law enforcement); and asked for the date of the most recent incident of each type of identity theft experienced in the past year. Cognitive testing was also recommended to ensure that respondents were able to understand all of the concepts presented in the drafted questions. The drafted questions were subjected to the recommended cognitive testing, which is discussed below.
The definition of identity theft used in the 2018 ITS and in previous waves of the ITS included attempted incidents of identity theft. The draft screener questions marked a change in the definition of identity theft used by BJS by excluding attempted identity theft. Even though the analysis of state identity theft laws suggested that the definition of identity theft used in the ITS remain unchanged, the secondary data analysis was able to examine the topic of attempted identity theft where the legal analysis could not. The secondary data analysis provided additional information and showed the difficulty that respondents had in identifying attempted identity theft incidents, which helped BJS make the decision to exclude attempted identity theft from the draft screener questions.
RTI International conducted cognitive testing on the drafted questions to assess whether respondents had challenges understanding the concepts and questions (Attachment 8).b The testing consisted of over two dozen cognitive interviews with adults. Results revealed many questions that none of the interviewees had difficulty understanding and answering as intended. However, the testing did reveal that respondents had some difficulty understanding what to include in the questions on the misuse of existing credit card and bank accounts and in understanding the instructions in the question asking for the month and year of the most recent incident of identity theft. Respondents also reported that examples of types of online payment and entertainment accounts would be helpful in responding to questions about the misuse of existing accounts other than bank and credit card accounts. In addition, some respondents had difficulty classifying the misuse of social media accounts, since it is possible to misuse them without a financial transaction taking place. In the question about the month and year of first discovery of the most recent incident of identity theft, some of those interviewed had difficulty with the last sentence of the question.
The testing led to several recommendations: (1) revise the wording of the questions about the misuse of existing bank accounts and credit cards to make it easier for respondents to understand what to include; (2) reconsider where to classify the misuse of social media accounts; (3) revise the examples of online payment and entertainment accounts in the question about the misuse of existing accounts other than bank and credit card accounts; and (4) remove the last sentence from the question about the month and year of first discovery of the most recent incident of identity theft to reduce confusion.
These results and recommendations were implemented in the draft screener questions. They led to wording changes in some of the drafted questions (including the addition of instructions about what to include and exclude in the questions on the misuse of existing bank and credit card accounts, and the deletion of the instruction in the question about the month and year of the most recent incident of identity theft) and to the creation of separate screener questions regarding the misuse of existing email and social media accounts.
Previous versions of the ITS did not include the misuse of existing email and social media accounts as a separate screener question (as is done, for example, for the misuse of an existing bank account). This type of misuse was instead presented as an example in the question asking about the misuse of existing accounts other than bank and credit card accounts. However, the cognitive interviewing revealed that email and social media accounts are different enough from the other types of accounts listed in that question to warrant a separate question, because the other accounts listed as examples focus on financial loss, which is less likely to occur with email and social media accounts.
After the cognitive interviewing, RTI International and the National Opinion Research Center (NORC) conducted an online pilot testc, testing three versions of questions that could potentially be used in the screener portion of the ITS questionnaire (Attachment 9).d Version 1 consisted of Section A and the first 3 questions of Section B from the 2018 ITS questionnaire. Version 2 consisted of the draft questions based on the results of the secondary data analysis and the cognitive interviewing. Version 3 consisted of the same questions as Version 1, with the exception that attempted incidents were excluded. Version 3 was created to test the effectiveness of asking only about successful incidents of identity theft. This randomized test consisted of more than 31,000 adult respondents, each of whom completed one of the three versions of the questions.
The ITS online testing experiment was designed to isolate the impact of different changes to the instrument on the prevalence of identity theft. For example, the Version 1 and Version 3 instruments were exactly the same, except for the exclusion of attempted incidents. Since the two versions were administered in the same mode and randomized across sample members, comparison of the prevalence rates would show the extent to which attempts are inflating the Version 1 prevalence. The potential for respondents to engage in telescoping would not impact this assessment since any telescoping would be consistent across the two versions.
In contrast, Version 2 was designed to better control for telescoping than Versions 1 and 3 through the use of questions about lifetime experiences with ID theft and additional dating questions. The inherent assumption was that if Version 2 controls better for telescoping, its prevalence rate will be lower than those for Versions 1 and 3. Therefore, prevalence was an important indicator of performance. However, the analysis of the testing results went well beyond a simple comparison of prevalence rates to also include a detailed assessment of the different dates provided by respondents in Version 2, to ensure that the information provided was logical and demonstrated the respondents' ability to place different aspects of an ID theft episode in time. Additionally, data quality measures, such as item missingness and out-of-reference-period dates, were examined across all three instrument versions and considered in the final determination.
The test results revealed that, among the three versions tested, Version 2 resulted in the lowest prevalence of identity theft and appeared to best control for telescoping. Respondents appeared to understand the distinctions in the dating questions and the majority were able to identify the month and year of the most recent occurrence of each type of identity theft reported in the past year. Based on the findings from the online pilot test, Version 2 was recommended for the 2021 ITS questionnaire. It was understood that selecting Version 2 would result in a break in series to the previous waves of the ITS due to the change in the definition of identity theft, with the exclusion of attempted identity theft and the inclusion of the misuse of existing email and social media accounts as a separate screener question.
Section A and part of Section B were replaced with Version 2 from the online test. This change required the remainder of the questionnaire to be revised, primarily to remove references to attempted identity theft. After the revision, RTI International conducted two rounds of expert review of the entire revised ITS questionnaire. These expert reviews noted that Section G (Long-Term Victimization and Consequences) needed to be changed. Previously, Section G was asked of all respondents and contained questions about identity theft that occurred prior to the past year and about problems that may have stemmed from the theft. Since questions regarding lifetime identity theft are now asked in the screener, it is no longer necessary to ask every respondent about identity theft prior to the past year.
After RTI’s second expert review, the Census Bureau also conducted several rounds of review that led to the deletion of questions that were not used by BJS and non-substantive wording changes to questions in Section G.
BJS takes responsibility for the overall design and management of the activities described in this submission, including developing study protocols, sampling procedures, questionnaires, and overseeing the conduct of the studies and analysis of the data by contractors.
BJS contacts include –
Erika Harrell
Statistician
202-307-0758
The Census Bureau is responsible for the collection of all data. Ms. Meagan Meuchel is the NCVS Survey Director and manages and coordinates the NCVS and ITS.
The Census Bureau contacts include –
Meagan Meuchel
NCVS Survey Director
Associate Director for Demographic Programs – Survey Operations
301-763-6593
Megan Ruhnke
NCVS Assistant Survey Director
Associate Director for Demographic Programs – Survey Operations
301-763-9842
C. Attachments
Attachment 1: 2021 ITS questionnaire
Attachment 2: Title 34, United States Code, Section 10132 of the Justice Systems Improvement Act of 1979
Attachment 3: 60-day notice
Attachment 4: 30-day notice
Attachment 5: “Identity Theft: What to know, What to Do” brochure
Attachment 6: Assessment of State Identity Theft Laws
Attachment 7: Identity Theft Supplement Secondary Data Analysis, Recommendations, and Next Steps
Attachment 8: Cognitive Interviewing for the National Crime Victimization Survey (NCVS) Identity Theft Supplement (ITS)
Attachment 9: Identity Theft Screener Online Testing: Final Report
a For the online pilot test, RTI International entered into a subcontract with the National Opinion Research Center (NORC) to help carry out the test, as explained below.
b Research previously approved by OMB under the BJS Generic Clearance Agreement (OMB Number 1121-0339).
c About 4% of participants completed the questionnaire by phone instead of online.
d Research previously approved by OMB under the BJS Generic Clearance Agreement (OMB Number 1121-0339).