Questions and Responses to OMB Pass Back


Firearms Inquiry Statistics (FIST) Program


OMB: 1121-0314



  1. Please provide advance letters, nonresponse follow up letters, phone scripts and any other supplemental materials, as the only letters provided are cover letters to the questionnaires.

The following draft correspondences are attached for OMB review:

  1. Draft pre-notice letter

  2. Draft Cover Letter Sent to State Agencies that Publish Data Online

  3. Draft Cover Letter Sent to State Agencies that Do Not Publish Data Online

  4. Draft Cover Letter Sent to Local Agencies

  5. Draft Thank You Note

  6. Draft Phone Script


These documents are described in additional detail in responses #10 and #11.


  2. Who is the data collector for this survey?

The current data contractor for this survey is Regional Justice Information Services (REJIS), a quasi-governmental entity founded in 1976. REJIS and BJS entered into a cooperative agreement to conduct the FIST data collection in 1994, and REJIS has administered the program since its inception. REJIS was created to provide information technology products and services to criminal justice and government agencies. The agency’s services include, among others, the development of custom applications and web designs; online and batch processing; installation and support of custom and commercially available software; systems integration; custom interfaces; networking; and training. REJIS has demonstrated expertise in data collection; data coding, entry, and verification; and the production of public-use data files. REJIS has retained the services of a statistical consultant to oversee the survey methodology and sample design.


The FY 2011 FIST solicitation was released in December 2010, and applications were due on January 24, 2011. The peer review process has started and the FY 2011 data collection agent is expected to be selected within the next few weeks.


  3. Is BJS engaging a survey organization or a marketing organization to assist in the collection?  It is not clear from part b.

A marketing agency had been employed in past years to assist in outreach and follow-up efforts, but these services have not been utilized in recent collections due to lack of resources. BJS will not utilize the services of a marketing organization to assist in the FIST collection.


  4. Is BJS planning to return to OMB in FY 12 with a revised data collection plan that includes web?

The FY 2011 solicitation outlines as an objective the creation of an electronic or Web-based data collection form for use by all respondents as part of a multi-mode collection effort. BJS anticipates that this form will be implemented during the FY 2012 data collection and will supplement current data collection efforts. BJS and the data collection agent will promote the use of the form to submit responses electronically but will continue to accept responses via each agency’s preferred mode of transmission in an effort to increase the response rate and decrease the respondent burden. Once the data collection form has been created, BJS will submit the web-based form to OMB for review and approval prior to implementation and will advise OMB of its implementation plan.





  5. Please change the words “confidentiality guarantee” to “confidentiality assurance” or the like in SS A10 and in the cover letters.  Under no circumstances should an agency “guarantee” confidentiality and we would like BJS to avoid that language.

BJS has made this change to the documents. The documents are attached for review (Attachments II, III, IV, and VII).


  6. In SS B1, the sample size is essentially justified as “this is what we did last time.”  BJS should provide a precision analysis that justifies the sample size and stratification approach in terms of acceptable sampling error around key estimates.


In 2009 there were 39 states that provided complete state data for this study. Of the remaining 11 states, data had to be collected via a sample of local agencies in order to estimate the number of applications, rejections, and reasons for rejection.


In 1996, when the data collection was initiated, a 4% margin of error was assigned to the sampling process for local agencies in the 30 states that did not have statewide data collection. The methodology that was employed at the beginning of the project was “sampling with probabilities proportional to size;”¹ this methodology remains in place currently. The size categories were determined based on the most recent U.S. Census population data: A) local agencies that covered localities with less than 10,000 population; B) local agencies that covered localities with between 10,000 and 100,000 population; and C) local agencies that covered localities with greater than 100,000 population.
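As an illustrative sketch only (not the data collection agent’s actual procedure), selection with probabilities proportional to size can be implemented with size-weighted random draws; the agency names, population figures, and with-replacement sampling below are assumptions for demonstration:

```python
import random

def pps_sample(agencies, k, seed=0):
    """Draw k agencies with probability proportional to the population each
    serves. Uses with-replacement draws for simplicity; a production design
    would typically sample without replacement within strata."""
    rng = random.Random(seed)
    names = [name for name, _ in agencies]
    sizes = [size for _, size in agencies]
    return rng.choices(names, weights=sizes, k=k)

# Hypothetical frame of (agency, population covered) pairs
frame = [("PD-1", 5_000), ("PD-2", 45_000), ("Sheriff-3", 250_000)]
sample = pps_sample(frame, k=2)
```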


For these 11 states in 2009, the following distribution of local agencies charged with collecting the application and permit information was determined. This constitutes the population of all such local agencies in the 11 states and was used as the initial weighting system.i


A          B        C       Totals

1,391      919      106     2,416

58%        38%      4%      100%



In 2009, using the sample size formula provided by Scheaffer, Mendenhall & Ott (pg. 74)ii, the required sample size for a 4% margin of error was determined using the following formula:

n = Np(1 − p) / [(N − 1)D + p(1 − p)]
Where:

n = sample size

N = population size

p = estimated proportion in population (set at .5 – most conservative)

D = B² / 4

B = Bound on error (set at ±4%)



Using the values in the formula yields the following:

n = (2,416)(.5)(.5) / [(2,415)(.0004) + (.5)(.5)] = 604 / 1.216 ≈ 497
Thus an overall sample size of 497 local agencies in the 11 states should yield an estimate within the 4% margin of error and could be combined with the 39 states with complete data to produce estimates for the United States. The 2009 sample size was 667. The resulting standard error (measure of precision) was ±.018 or about 2%. This is less than half the margin of error set for the sample size determination (4%).
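The sample size determination above can be verified with a short computation; this sketch simply restates the Scheaffer, Mendenhall & Ott formula with the values given in the text:

```python
import math

def required_sample_size(N, p=0.5, B=0.04):
    """n = Np(1 - p) / ((N - 1)D + p(1 - p)), where D = B^2 / 4."""
    D = B ** 2 / 4
    return N * p * (1 - p) / ((N - 1) * D + p * (1 - p))

# Frame of 2,416 local agencies, p = .5 (most conservative), B = +/-4%
n = required_sample_size(2416)
# math.ceil(n) gives the 497 agencies cited above
```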


Confidence intervals (set at the 95% level) and standard errors are computed each year for the national and local estimates to measure the standard deviation of the sampling distribution (Attachment VIII).
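As a rough illustration of such a computation, the standard error and 95% confidence interval for an estimated proportion under simple random sampling with a finite population correction can be sketched as below; this simplified estimator is an assumption for demonstration and will not exactly reproduce the program’s published standard errors (Attachment VIII):

```python
import math

def proportion_ci(p, n, N, z=1.96):
    """Standard error of a proportion with a finite population correction,
    plus the corresponding 95% confidence interval."""
    se = math.sqrt(p * (1 - p) / n) * math.sqrt((N - n) / (N - 1))
    return se, (p - z * se, p + z * se)

# 667 sampled agencies from a frame of 2,416, with p = .5
se, ci = proportion_ci(p=0.5, n=667, N=2416)
```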


BJS intends to utilize the current sample in 2010. A stated objective outlined in the FY 2011 FIST solicitation is the development of a sampling plan for the collection of 2011 application and denial data from local checking agencies to ensure that reliable national estimates can be developed. In 2011, BJS intends to redraw the sample of local agencies and will provide OMB with the design specifications prior to implementation.


  7. Is the sample redrawn each year?


Numerous changes have been made to the Brady policies and procedures since the FIST program began that resulted in significant changes to the background check process. The following description briefly summarizes the major changes and provides context to explain how the current sample was identified.


Initial sample

When the sample was first established, there were 20 states operating with a single checking agency responsible for gun regulation (known as the Chief Law Enforcement Officer, or CLEO), while 30 states operated with multiple CLEOs. There were a total of 5,477 CLEOs in these 30 states. BJS and the data collection agent selected a random sample from these 30 states to develop the national estimates; the sample was stratified based on the most recent Census population data (population over 100,000; population between 10,000 and 100,000; and population less than 10,000). These population categories were selected to be consistent with those used by the FBI in similar studies. The overall sample size of 561 was selected using a margin of error of ±4%. The 20 states with a single CLEO plus the random sample of CLEOs in the other 30 states covered approximately 65% of the U.S. population.


Changes to the sample

A change that had a major effect on the agencies sampled occurred between 1998 and 2005, when many states transferred background check responsibilities to the FBI. As of 2009, the FBI had assumed responsibility for all checks in 29 states covering 46% of the entire U.S. population. This shifted a number of states with multiple checking agencies processing applications into the group with a single, statewide data collection system. In 16 states, 30 agencies have background check responsibilities, representing 44% of the U.S. population; all 30 agencies are included in the data collection and responded to the survey. Local agencies conduct background checks for handgun permits in 5 states covering 10% of the population. Local agencies also conduct special types of permit checks (exempt carry permits) in six additional states that are either under the responsibility of the FBI or operate a central statewide checking agency; data collected from these agencies are included in the sample. Data on background check activities collected from the FBI and the state agencies are true counts that are combined with the local agency estimates to produce the national estimates.


Current sample

In 2009, data were collected from 11 states that operated local background checking agencies. As discussed in response #6, the sample was stratified by population, and census data were used to calculate the relative weights of samples from local agencies. National estimates are produced from the counts provided by the FBI, state agencies, and responding local agencies, using weighting factors derived from the original stratification; this process is automated with computer programs developed by the data collection agent and regularly updated to reflect changes in the background check system.
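The combination of true counts and weighted local estimates described above can be sketched as follows; the counts and weights are hypothetical, and the actual programs developed by the data collection agent are more involved:

```python
def national_estimate(true_counts, local_sample):
    """Add true counts (FBI and state agencies) to weighted local estimates.
    Each local record is (observed_count, weight), where the weight inflates
    a responding agency to represent its stratum."""
    local_estimate = sum(count * weight for count, weight in local_sample)
    return sum(true_counts) + local_estimate

# Hypothetical inputs: two statewide true counts, three weighted local agencies
estimate = national_estimate(
    true_counts=[120_000, 85_000],
    local_sample=[(1_200, 4.5), (3_400, 2.9), (800, 6.0)],
)
```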


The sample of local agencies in these 11 states is not redrawn each year. The local sample has a small impact on the overall national totals: complete data are available for 90% of the U.S. population, and the remaining 10% is derived from the local estimates. After weighting the response rate to adjust for the state agencies’ contribution to the entire population, the readjusted response rate is approximately 96%. However, in order to ensure that an optimal research design is achieved, BJS intends to redraw the sample for the 2011 data collection.
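The population-weighted adjustment described above can be sketched as a one-line computation, using the 90%/10% population split stated here and the 65% local response rate cited in response #11:

```python
def weighted_response_rate(complete_share, local_share, local_rate):
    """Population-weighted response rate: true counts cover complete_share
    of the U.S. population at a 100% rate; the local sample covers
    local_share at local_rate."""
    return complete_share * 1.0 + local_share * local_rate

rate = weighted_response_rate(0.90, 0.10, 0.65)  # 0.965, i.e., approximately 96%
```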


In revising the sample, BJS will examine the sample size and expected precision relative to improved response rates expected through the addition of a new collection mode and enhanced non-response procedures.


  8. Please describe frame building and maintenance activities, as well as what is known about frame completeness and currency.


When the FIST program began in 1996, the data collection agent’s first objective was to develop a master file of all reporting agencies across the country authorized to conduct firearm background checks. This list has been continuously updated as changes in Brady policies and procedures impacted how background checks are conducted in each state. The following description provides context of how the data collection agent developed the original master file of reporting agencies and currently maintains it to ensure frame completeness and currency.


Initial frame building activities

At the beginning of the FIST program, the Interim Provisions of the Brady Act required that the Chief Law Enforcement Officer (CLEO) in each locality of a state assume the responsibility of accepting purchase applications and performing background checks to determine if the individual was qualified to purchase a firearm. Each state determined the CLEOs in its state. CLEOs could be local police departments, sheriffs, or a state agency that processed firearm applications. In order to identify the CLEO in each state, it was necessary to contact each state individually and determine the entity legally responsible for performing the check.


The data collection agent contacted all of the states and obtained a detailed description of how the firearm checking process worked in each state. This information included the description of all agencies authorized to perform the check. In some cases the state law identified a specific agency by name and in other cases it identified an agency by category such as all sheriffs’ offices in the state.


The data collection agent then had to identify the specific agencies that performed the check if the state law did not list them by name. At the beginning of the project, this was done by comparing state laws against a master list built by the data collection agent, which was created from a list maintained by the FBI that identified the total number of legitimate criminal justice agencies by state. This list represented all of the agencies that had the authority to perform background checks and included contact information for each agency. The data collection agent then matched the list of agencies against the legal definitions in each state to determine which agencies were legally responsible for performing the checks. From this process the universe file was created.

Past frame maintenance activities

Changes to the provisions of the Brady Act affected the survey procedures, and the list of agencies conducting background checks changed significantly in 1998, when the FBI assumed responsibility for background checks for all states unless a state specifically chose to perform the checks and fulfilled the FBI criteria. This resulted in many states opting out of background check activity entirely and also decreased the number of local agencies legally authorized to conduct the checks. The data collection agent dedicated significant time and resources to stay informed of changes to state laws and background check processes in order to maintain an accurate list of agencies that had responsibility for this area.


From 1996 to 2005, the data collection agent was responsible for gathering information about state laws and policies for the Survey of State Procedures Related to Firearm Sales (SPS), a BJS publication series that provides an overview of firearm check procedures in each state. Prior to publication each year, the data collection agent contacted each state and requested that it review and update its information as necessary. As a result, the data collection agent was made aware of any changes to state laws related to firearm background check processes during the years the series was published.


Current frame maintenance activities

After the SPS series was discontinued in 2005, the data collection agent continued to research state laws online to determine if changes to background check processes had been implemented. Additionally, the data collection agent maintained contact throughout the project period with both the FBI and ATF, the agencies that authorize agencies to conduct background checks for firearm sales.


Finally, the data collection agent contacts agencies from each state annually and requests updates to their contact information. The agent also takes this opportunity to inquire about the status of any applicable changes to background check procedures.


Since the permanent provisions of the Brady Act have been enacted, there have not been a significant number of changes to the frame from year to year. Given the various methods employed to assess changes to state firearm laws, including a strong collaboration with the FBI and ATF, BJS is confident that adequate measures are in place to maintain an accurate file of state and local agencies authorized to conduct background checks.


  9. In SS B2, given agency levels of automation, why is BJS using fax machines to deliver the survey rather than at least email?

Respondents are given the option to submit data via fax, email, or mail. Historically, the majority of agencies have elected to return the survey by fax because this has been their most convenient mode of transmission. Many of the smaller local agencies have only recently gained access to the internet and email and have not yet developed a strong comfort level with submitting responses electronically.


The data collection agent will take all possible steps to make the submission process as easy as possible for the respondent in order to reduce the burden and will accept responses however the agency prefers to send them. The FY 2011 FIST solicitation outlines as an objective the creation of a web-based collection form for use by all respondents as part of a multi-mode collection system. BJS and the data collection agent will encourage the respondent to submit responses electronically, but will also continue to accept responses by their preferred method.


  10. BJS should provide the specific data collection strategy, including any use of advance notification, nonresponse follow up at prespecified intervals and modes, etc.  Someone should essentially be able to replicate your data collection by what they read in the supporting statement and what is currently there is inadequate to understand the procedures planned.


Data collection strategy in 2011 (for 2010 data)

April to May – The data collection agent will research and update its comprehensive list of state contacts, and will research and confirm any changes in the background check responsibilities of respondent agencies to determine the population universe. The data collection agent will also collaborate with the FBI and ATF to receive relevant data on firearm background check activity and post-denial activities.


June

Week 1: A prenotice letter will be sent via mail, email, or fax based on the contact information available for the agency and the agency’s preferred mode of communication.


Week 2: A survey will be sent via mail, email, or fax based on the agency’s preferred mode of communication. A detailed cover letter will be included to explain the importance of a response and indicate alternate submission modes. A requested response deadline is provided (2 weeks after the survey is sent).


Weeks 3-4: A thank you note will be sent via mail, email, or fax to express appreciation for responding to the survey, even if the survey has not yet been returned.


The data collection agent will enter data into the project databases as it is received and will continue to review state websites and FBI reports to extract published data. The data collection agent will update contact information for agencies as needed.



July

Week 2: A replacement survey will be sent via mail, email, or fax to agencies that have not yet responded. This will be sent 2-4 weeks after the initial survey is sent.


Week 3: A follow-up phone call will be made to agencies that have not responded to the survey (if a telephone number is provided). Otherwise, a request will be made via mail, email, or fax.


The data collection agent will enter data into the project databases as it is received and will continue to review state websites and FBI reports to extract published data. Data verification efforts continue. The data collection agent will update contact information for agencies as needed.


August

Weeks 1-2: The data collection agent will make one final attempt to reach the reporting agency. The mode of outreach will vary depending on the history of past attempts made.


Week 4: Data entry concludes.


The data collection agent will enter data into the project databases as it is received and will continue to review state websites and FBI reports to extract published data. Data verification efforts continue. The data collection agent will update contact information for agencies as needed.


September to October

Work to produce the estimates will begin. Data processing and analysis continue. Data verification will continue as needed.


November

Final reports and statistical tables will be completed and submitted to BJS for review. Efforts to maintain the master contact list and research state laws for changes in firearm background check procedures continue.


Throughout the data collection process, the data collection agent will maintain a comprehensive record of all follow up and reporting activity and log details of when data is received, from whom, by what means (fax, email, etc.) and applicable changes in address and other contact information. This will be done to ensure that duplicate requests are not made to agencies and that the agency’s preferred mode of submission is noted for subsequent years. Five attempted contacts will be made to each agency before it is considered to be nonresponsive. Specific dates will vary annually depending on holiday and staff schedules. The data collection agent will vary the modes of outreach so the reporting agency receives at least one phone call, one email (if an email address is available), and one fax or letter request.
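The tracking record described above might be sketched as a small data structure; the field names and example entries are illustrative assumptions, not the data collection agent’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class ContactLog:
    """Per-agency record of outreach attempts and preferred submission mode."""
    agency: str
    preferred_mode: str = "fax"
    attempts: list = field(default_factory=list)  # (date, mode) pairs

    def record(self, date, mode):
        self.attempts.append((date, mode))

    def nonresponsive(self):
        # Five attempted contacts before an agency is considered nonresponsive
        return len(self.attempts) >= 5

log = ContactLog("Example County Sheriff")
for date, mode in [("06-01", "mail"), ("06-08", "fax"), ("07-12", "fax"),
                   ("07-19", "phone"), ("08-02", "email")]:
    log.record(date, mode)
```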


December to March

BJS and the data collection agent will continue to evaluate the FIST program design and sampling plan. The sample will be redrawn to prepare for the 2011 data collection.


The data collection schedule may be revised and adjusted to accommodate data collection activities based on the selection of the data collection agent for FY 2011.


  11. Based on the survey literature, what specifically will BJS and its data collection agent do this year that is expected to increase response rates from 60% to 80%?


Although the sampled local agencies produced an overall response rate of 65%, the adjusted response rate is approximately 96% after weighting responses by population covered. As part of the new cooperative agreement, BJS will enhance its outreach and follow-up efforts in order to increase the response rate of the local reporting agencies, as well as to maintain its current 100% response rate for the state reporting agencies. BJS and the data collection agent will model their contact methodology on the tenets outlined in Don Dillman’s work on mail and internet surveys (Dillman, 2000).iii The FIST data collection has consistently employed two of Dillman’s suggested elements: a respondent-friendly questionnaire and the inclusion of self-addressed stamped envelopes when mailing the survey. However, in order to increase the response rate of local reporting agencies to 80%, BJS will take additional steps to better engage respondents.


The specific steps BJS and the data collection agent will take include increasing the number of contacts from three to five (as suggested by Dillman): a brief prenotice letter sent via fax, email, or mail to respondents a few days prior to the survey (Attachment I); the applicable cover letter (Attachments II, III, and IV) included with the survey; a thank you note, either in the form of a postcard or an email (Attachment V); a replacement survey and applicable cover letter sent to nonrespondents 2-4 weeks after the initial survey mailing; and a final contact made by phone within two weeks after the fourth contact has been attempted (Attachment VI). As identified by Dillman, this type of contact sequence draws on different ways of engaging the respondent, each conveying a different message and expectation.


In addition to following a more systematic follow-up schedule, BJS and the data collection agent will continue to tailor their correspondence in order to convey a sense of personal outreach. The data collection agent has made efforts in the past to consistently personalize letters, and this practice will continue in future collections. Additionally, BJS will send the initial contact on official BJS letterhead in order to convey BJS responsibility.


BJS will also notify each respondent that the final publication is available on the BJS website to evoke a sense of ownership in helping to create the final product.


In addition to enhancing its follow-up practice, BJS and the data collection agent will continue to employ multi-modal submission options to decrease the respondent burden. A stated objective of the FY 2011 FIST solicitation is the development of an electronic or Web-based data collection form for use by all respondents as part of a multi-mode collection system. While electronic data submission will be encouraged, the data collection agent will continue to be flexible in how it accepts data and will continue to track each agency’s preferred mode of transmission in order to tailor its outreach efforts.


The peer review process for the FY 2011 FIST solicitation is currently underway. One of the stated objectives of the solicitation is to create a data collection and follow-up plan that achieves an 80% response rate for the sample of local agencies. BJS will evaluate the proposed plans and consider implementing other ideas not listed above in an attempt to increase the overall response rate. While increasing the response rate is a priority issue for the FIST program, the modes and levels of follow-up efforts are contingent upon the availability and level of funding resources.


  12. What analysis from the frame or other sources can BJS do of nonresponse bias particularly for the smaller jurisdictions?  As BJS knows, OMB standards require a prespecified nonresponse bias analysis plan for surveys where the expected response rate (based primarily on last administration) is anticipated to be below 80%.


In order to account for potential nonresponse, the sample was increased by 34% over the required sample size. It was also determined that there should be a minimum of 5 local agencies in each category for each of the 11 states, where possible. This oversampling resulted in the following distribution of local agencies by category in the sample:iv


A          B        C       Totals

307        314      46      667

46%        47%      7%      100%


The following is the actual CLEO response to the data collection by category. The overall response rate was 70%, which is in an acceptable range, and the distribution of responding local agencies by category follows the sample distribution (Attachment X).


A          B        C       Totals

207        229      28      464

45%        49%      6%      100%
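The response rates implied by the two distributions above can be checked directly; this sketch simply divides the responding counts by the sampled counts:

```python
sampled    = {"A": 307, "B": 314, "C": 46}
responding = {"A": 207, "B": 229, "C": 28}

rates = {cat: responding[cat] / sampled[cat] for cat in sampled}
overall = sum(responding.values()) / sum(sampled.values())
# overall is 464/667, which rounds to the 70% response rate cited above
```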


An indicator of the reliability of the sampled local agencies is the overall variability in the data. Across all categories of states and groups of local agencies, the variability in the rejection rate is low and relatively consistent. This indicates that the local agencies that were not sampled are highly unlikely to produce application rejection rates with extremely high (or low) variability, or extremely high or low counts relative to their population size.


BJS recognizes that nonresponse bias is a concern for surveys with an anticipated response rate of less than 80%. However, the overall response rate is approximately 96% after the response rate is weighted to reflect the proportion of the population for which true counts are available. Coupled with its efforts to increase the response rate of the local agencies to 80%, BJS will work with the selected FY 2011 data collection agent to determine additional methods to assess the existence of potential bias due to unit nonresponse.


In FY 2011, BJS intends to conduct a nonresponse bias assessment at the state level if the response rate within any state falls below 80%. Several methods can be employed to conduct this analysis. One option is to compare respondents and nonrespondents across subgroups using available sample frame characteristics, which will provide additional information about the presence of nonresponse bias. Another option is to employ formal multivariate modeling to compare the proportional distribution of characteristics of respondents and nonrespondents to determine the presence and magnitude of nonresponse bias. A third option is to compare the characteristics of early respondents to those who respond later (after more attempts) to estimate the characteristics of the remaining nonrespondents. In collaboration with the selected data collection agent, BJS will assess the advantages of each approach in relation to the level of required resources to determine which method is most appropriate.
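The third option (comparing early and late respondents) can be sketched as follows; the data and the use of rejection rate as the comparison characteristic are hypothetical:

```python
def early_late_comparison(respondents, cutoff_attempt=2):
    """Split respondents by how many contact attempts preceded their response
    and compare mean rejection rates; a large gap between the two groups
    suggests potential nonresponse bias. Input: (attempt_number, rejection_rate)."""
    early = [rate for attempt, rate in respondents if attempt <= cutoff_attempt]
    late = [rate for attempt, rate in respondents if attempt > cutoff_attempt]
    mean = lambda values: sum(values) / len(values) if values else float("nan")
    return mean(early), mean(late)

# Hypothetical responses: (attempt at which agency responded, rejection rate)
data = [(1, 0.015), (1, 0.022), (2, 0.018), (4, 0.020), (5, 0.017)]
early_mean, late_mean = early_late_comparison(data)
```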


Additionally, the data collection agent will continue to reduce the potential for nonresponse bias by employing methods to increase the response rate, including a more concerted follow-up plan and a multi-mode collection system, as addressed in response #11.


  13. What is the referenced “safety” issue with web and other technologies given that many other statistical agencies have been securely using the web for years and that there is no sensitive or confidential data in this collection?

BJS has no concerns associated with the safety or security of using a secure-web data collection process and a stated objective in the FY 2011 FIST solicitation is the creation of an electronic or Web-based data collection form for use by all respondents as part of a multi-mode collection system.


In the past, some local agencies have voiced concerns about the security of sending the requested information electronically, which is the basis of the referenced safety concern. However, as more local agencies gain both access to and familiarity with the web and other technologies, this is no longer as significant a concern. Upon the completion of the secure web data collection instrument, the data collection agent will promote the use of electronic response submissions and will be available to provide technical assistance as needed.


  14. Ineligibles.

    a. What was the ineligible rate from the last two survey administrations?


For this data collection, the only factor that results in an agency’s ineligibility to participate in the survey is that it is no longer authorized to conduct background checks for firearm purchases. These agencies are removed from the universe file and sample prior to the implementation of the data collection.


    b. Why are those cases included in the nonresponse rate if they are ineligible?


Agencies that are ineligible to participate in the survey due to changes in their background check reporting functions are removed from the sample and are not included in the nonresponse rate. The nonresponse rate is composed only of eligible agencies that have elected to not participate in the survey.


    c. What frame maintenance activities is BJS doing to address this issue?


Please see response #8 for a detailed description of the frame maintenance activities BJS and the data collection agent conduct to ensure that the sample and frame are composed of only those agencies authorized to conduct background checks.


    d. If ineligibles are an ongoing problem, why doesn’t BJS include a screener question at the start of the questionnaire?


Due to the number of measures in place to identify changes to state laws related to background check procedures, ineligibles are not a significant concern for this data collection.




Attachments

  1. Pre-notice letter

  2. Survey cover letter to state agencies that submit monthly reports to FIST and/or publish statistics online

  3. Survey cover letter to state agencies that do not submit monthly reports to FIST and/or publish statistics online

  4. Survey cover letter to local agencies

  5. Thank you note

  6. Phone script

  7. Revised Supporting Statement A10

  8. Estimated standard errors, 2009 FIST

  9. Local agency sample breakdown by size


Contact information:

Allina Boutilier, BJS

202-305-2696

Allina.boutilier@usdoj.gov

i The 2009 FIST statistical tables include New Jersey in the population of local agencies sampled because the state is surveyed to collect data on reasons for rejection which are integrated into the national estimate. New Jersey state law requires that local agencies report counts of firearm background checks which results in the collection of complete data on applications and rejections for the state. For the purpose of this exercise to assess the reliability of estimates in relation to the current sample size and response rate, New Jersey was removed from the sample in order to more accurately assess and justify the sample size.

ii Scheaffer, R.L., Mendenhall, W., and Ott, L. (1990). Elementary Survey Sampling, 4th edition. Boston: PWS Kent. Chapter 4, pp. 76-82.

iii Dillman, Don. (2000). Mail and Internet Surveys: The Tailored Design Method. New York: John Wiley & Sons.

iv See endnote i for additional discussion of methodology.


