OSHA Data Initiative Collection Quality Control:
Analysis of Audits on CY 2006 Employer
Injury and Illness Recordkeeping
Task Order No. 3
Base Year
Contract No. J-099-F-2-8441
FINAL REPORT
November 25, 2009
Prepared for:
Office of Statistical Analysis
Occupational Safety and Health Administration
Washington, DC
Prepared by:
ERG
Lexington, MA
&
National Opinion Research Center
Chicago, IL
CONTENTS

EXECUTIVE SUMMARY
INTRODUCTION
AUDITS OBJECTIVE
AUDIT METHODOLOGY AND ANALYTICAL APPROACH
    State Plan State Participation
    Sampling Universe
    Sample Selection of Establishments
    Audit Protocol and Sampling of Employees within Establishments
    Analysis
FINDINGS
    Universe Estimates for CY 2006 Recordkeeping
    Case Analysis
    Submission Comparison Analysis
SUMMARY AND RECOMMENDATIONS
APPENDICES
    A. List of OSHA Data Initiative Collection Quality Reports and Related Studies
    B. Background on the OSHA Injury and Illness Recordkeeping Audit Program
    C. OSHA Instruction: Audit and Verification Program of Occupational Injury and Illness Records
    D. Tracking Status Codes Used in Processing CY 2006 ODI Submissions
EXECUTIVE SUMMARY
This report presents findings on the analysis of audits on calendar year (CY) 2006
employer injury/illness recordkeeping. It is the eleventh audit program analysis.
Background
In 1995, the Occupational Safety and Health Administration (OSHA) established its Data
Initiative Collection System (ODI) to gather and compile occupational injury and acute illness
information from some 80,000 establishments in high-hazard industries. At the same time, the
Agency developed mechanisms to ensure the accuracy of the collected ODI data for OSHA’s
use—particularly in combination with other data sources—for targeting enforcement and
compliance assistance interventions. OSHA’s ongoing data quality efforts address both the data
collection process and the source records (i.e., employer recordkeeping on the OSHA 300 Log)
as an integral part of the ODI.
OSHA established the audit program with its onsite audits of employer injury and illness
records to annually assess and monitor the quality of employer injury/illness recordkeeping
nationwide.* The audit program has focused only on non-construction establishments, with the
exception of the sixth year of the program when OSHA conducted a pilot of the audit
methodology in a sample of construction establishments. Budget constraints have precluded
implementation of the audit program in construction establishments.
OSHA considers onsite audits of employer injury and illness records a key method of
verifying the accuracy of data submitted for the ODI and for estimating the extent of employer
compliance with OSHA recordkeeping requirements defined in 29 CFR 1904. In order to
implement this quality control component, OSHA developed a protocol for reviewing a sample
of employee injury/illness records within a sample of establishments as well as software to
streamline a process that was otherwise too resource intensive for widespread use.
Objective
The primary objective for OSHA in the eleventh year of the audit program was to
estimate CY 2006 employer injury/illness recordkeeping accuracy nationwide based on OSHA
recordkeeping audits conducted according to an established protocol at a sample of non-construction establishments drawn from the standard ODI universe.
Audit Methodology and Analytical Approach
OSHA implemented the audit program by selecting a sample of audit establishments
from a standard ODI universe. Each year OSHA compiles the standard ODI universe using a file
from Dun & Bradstreet that provides the most currently available industry, employment, and
* This report represents the reporting-year analysis of a three-year analysis cycle that includes two interim-year summary analyses followed by a comprehensive report for submission to the Office of Management and Budget (OMB).
location information on establishments. OSHA defines a standard ODI universe to be able to
generalize the annual estimates of overall accuracy for employer injury and illness recordkeeping
to ODI establishments nationwide and to facilitate year-to-year comparisons.
For this year of the program, OSHA again selected establishments from a universe that
covered industries included in all years of the ODI. More specifically, OSHA used a standard
ODI universe that included approximately 117,000 establishments nationwide that met the
following criteria:
• Establishment is located in one of the States participating in the ODI (i.e., either in the Federal OSHA jurisdiction or in one of the participating State Plan States).

• Establishment has total employment of 40 or more.

• Establishment is in one of the Standard Industrial Classification (SIC) codes selected for any of the annual ODI collections.
To select a sample of audit establishments from a standard ODI universe and to increase
the likelihood of having 250 completed audits available for the analysis, OSHA implemented the
following steps:
Step 1: Draw an initial sample of 399 establishments from the standard ODI universe of
117,306 establishments. Before making this initial selection, OSHA sorted establishments
in the sampling frame by industry code, region, and employment size, resulting in an
implicit stratification. OSHA then drew the sample of establishments using a systematic
selection procedure.
Step 2: Include all establishments selected for the initial sample in the ODI universe for
the CY 2006 collection year.
Step 3: At completion of the ODI data collection cycle for CY 2006, eliminate from the
sample any establishments that did not meet audit program requirements (e.g., because the establishment was located in a State Plan State that had chosen not to participate in the audit program or the establishment's ODI submission for CY 2006 was not OK-verified).
Step 4: Assign the remaining sample establishments for an audit.
Step 5: Eliminate any completed audits that diverged from audit procedures in the
protocol.
As in other years of the audit program, OSHA committed to conducting 250 audits.
Previous analyses have established that selecting and assigning a sample of exactly 250 audits at
the outset is unlikely to yield the optimum number of completed audits for the analysis. A
shortfall can result because in some instances audits are not conducted due to constraints on
resources.
The target sample size is based on a National Opinion Research Center (NORC)
determination that this approximate number of audits would provide an acceptable level of
power for detecting overall accuracy of employer recordkeeping at-or-above a 95 percent
threshold. This also would enable OSHA to provide reasonable estimates of accuracy for the
universe of establishments. As established for the previous audit program analyses, at lower level
break-outs, such as at the industry level, universe estimates would be considered unstable
because of the relatively small number of establishments that might occur in the subcategories of
the sample. (See National Opinion Research Center, Final Report: Sample Design for a
Statistically Valid Evaluation of Accuracy and Completeness of an Establishment's OSHA-Mandated Employee Records, 1996.)
OSHA implemented the same general approach for analyzing the results of the
establishment audits as was used in past years of the program. The analysis approach addressed
two general areas:
Methodology for Implementing the Audit Cycle
• Reviewing the documentation on the audits for completeness and adherence to the established protocol.

• Comparing the characteristics of the sample of establishments audited to those of establishments in the standard ODI universe.

Results Related to the Accuracy of Employer Injury/Illness Recordkeeping

• Calculating universe estimates of the overall accuracy of employer injury and illness recordkeeping based on the results of the audits and the sample design.

• Comparing recordkeeping accuracy estimates from the eleventh-year audit program with results from the tenth year.

• Performing a case-level analysis that describes the types of recordable cases the auditors identified in the sample and details the recording errors they discovered.

• Comparing the employers' Log Summary and employment and hours worked data at the establishment at the time of the audit with the data submitted to OSHA in response to the CY 2006 ODI collection request.
Three principal size group categories based on average employment were used—“all
small" (40-99 employees), medium (100-249 employees), and large (≥250 employees). Also, as
with the past eight audit program analyses, a small establishments subcategory of 40-49
employees was used to continue to assess any effect of the inclusion of smaller establishments in
the ODI.
The universe estimate analysis focused on the types of recording errors that affect an
employer’s injury and illness rate, including:†
• Underrecording of total recordable cases—The employer does not record an injury or illness that should have been entered on the Log.

• Underrecording or misrecording of DART cases (days away from work, restriction, or transfer injury/illness cases)—Either the case is not recorded on the Log or the case is recorded as a non-DART case.

Recording and correctly classifying DART cases affects the accuracy of an establishment's combined DART injury and illness rate, which is a rate that OSHA uses for targeting purposes. (In more recent years, OSHA also has been using the establishment's days-away-from-work case rate in conjunction with the DART rate for targeting.) Other types of recording errors, such as incorrect day counts or an injury recorded as an illness, were not analyzed because they do not affect the calculation for either the DART injury and illness rate or the days-away-from-work rate.
OSHA examined the overrecording of cases in regard to the universe estimates as a
separate step. Overrecorded cases are those cases found on the employer’s Log that the auditor
has determined are non-recordable based on a review of employee records during the audit (e.g.,
an injury occurred but only required first aid).
A case-level analysis looked at the number and percent of establishments with particular
types of injury and illness case recording results. The types of underrecording errors for total
recordable and DART cases reconstructed in the sample were also determined. The numbers in
the case-level analysis are unweighted and are not intended to support conclusions about the universe of establishments. The information suggests relative distributions of the types of recording errors, but additional study or a redesigned, larger sample for future audits would be required to fully interpret their significance.
Summary of Findings
Overall Accuracy of Employer Recordkeeping. The percent of establishments classified
with accurate recordkeeping (at-or-above the 95 percent threshold) is above 96 percent for both
total recordable and DART injury and illness cases. Based on 95 percent confidence intervals for
the two estimates, the percentages of 98.34 percent for total recordable cases and 96.27 percent
for DART cases are not statistically different. Overall, the universe estimates for this year are
consistent with the level of accuracy observed for employer injury and illness recordkeeping
over previous years of the audit program. OSHA applied a statistical test to the accuracy
estimates for CY 2006 and CY 2005 and found no significant difference in the means for either
total recordable or DART cases. Among manufacturing and non-manufacturing, the overall
percent of establishments below the threshold of accuracy was similar for total recordable and
DART cases.
† Because the auditors did not find any cases of underrecorded or misrecorded fatalities in the sample, no analysis was required for this type of case. Auditors did find two fatality cases correctly recorded on the Log.
Case analysis. In the sample of establishments, for injuries, non-DART cases were the type of case most frequently not recorded on the Log, followed by cases involving only days away from work (DAFW). For illnesses, only one unrecorded case (a restricted work activity or transfer case) was found by auditors.
Submission Comparison Analysis. DART cases had the highest percent of establishments
with exactly the same data found on the Log and submitted to OSHA for the ODI. For hours
worked, the audits found slightly more hours worked for firms in the “medium” category than for
the other size groups.
Summary and Recommendations
Summary. This analysis represents the eleventh year of OSHA’s audit program on
employer injury and illness recordkeeping. The audit program is well established and the
protocol operates efficiently.
Across all of the years of the program, a number of findings remain consistent:
• Based on the estimates of the accuracy of employer injury and illness recordkeeping, the OSHA Log and employment data collected through the ODI represent reasonable quality for OSHA's targeting and performance measurement purposes.

• Both some overrecording and underrecording are observed.

• Underrecording errors are not widely distributed across the sample of establishments. A small number of establishments account for most of the underrecorded cases.

• Differences found in comparing the audit data with the data submitted to OSHA result in very few changes of the inspection targeting category status of establishments.
Findings this year on the CY 2006 employer injury and illness recordkeeping are:
• Audits available for the analysis. After following the sample selection steps, a total of 241 audits were available for use in the universe estimates, case-level analysis, and comparison of onsite and submitted data. (This year OSHA did not reach the methodology's target of 250 establishment audits available for conducting the analysis.)

• Distribution of audit establishments across the standard universe. Overall, the sample of audited establishments appears representative of the standard ODI universe by industry at the 2-digit SIC level, reflecting the effect of implicit stratification.

• Recordkeeping accuracy universe estimates. Generalizing from the sample of establishments audited for CY 2006 recordkeeping, the percent of establishments classified with accurate recordkeeping (at-or-above the 95 percent threshold) for the standard ODI universe is above 96 percent for both total recordable injury/illness cases (98.34%, SE 0.82%) and DART injury/illness cases (96.27%, SE 1.22%). Further, based on 95 percent confidence intervals for the two estimates, the percentages for total recordable and DART are not statistically different.

As a separate step in the universe estimates analysis, OSHA also examines the overrecording of cases (i.e., cases found on the employer's Log that the auditor has determined are non-recordable based on a review of employee records during the audit). Overall, this year's results on the overrecording of cases are consistent with the level observed previously.
Recommendations
1. OSHA should continue the audit program with its established process as a quality
control mechanism to ensure that the acceptable level of accuracy in employer
injury/illness recordkeeping for the ODI data collection is maintained.
2. OSHA should continue to use the information from the audit analysis in outreach
efforts to promote improvements in employer injury and illness recordkeeping, with
an emphasis on the correct recording of DART cases. For example, this report or
summaries of the findings should be made available to Agency compliance officers
conducting the recordkeeping audits. In addition, this information should be provided
to compliance officers conducting the recordkeeping inspections under the Injury and
Illness Recordkeeping National Emphasis Program (RK NEP), since these
inspections follow procedures very similar to the protocol for the audit program.
3. A further refinement OSHA should consider is to shift from SIC codes to the North
American Industry Classification System (NAICS) for compiling the standard
universe that the Agency uses for selecting the annual sample of audit establishments.
Consideration of this change should involve assessing possible effects on the audit
program methodology.
4. In keeping with a recent recommendation from the Government Accountability
Office (GAO) to make the optional employee interview component of the audit
process mandatory, OSHA should consider potential issues associated with
conducting worker interviews about possible past events (such as employee turnover
and possible memory-effects biases) that present challenges for data accuracy. For
example, OSHA should consider interviewing workers about any injuries or illnesses
that occurred in both the reference year and the most recent calendar year. The compliance officer would then include a review of the Log for the
most recent year if any incidents were identified in the interviews.
INTRODUCTION
In 1995, the Occupational Safety and Health Administration (OSHA) established its Data
Initiative Collection System (ODI) to gather and compile occupational injury and acute illness
information from some 80,000 establishments in high-hazard industries. At the same time, the
Agency developed mechanisms to ensure the accuracy of the collected ODI data for OSHA’s
use—particularly in combination with other data sources—for targeting enforcement and
compliance assistance interventions. OSHA’s ongoing data quality efforts address both the data
collection process and the source records (i.e., employer recordkeeping on the OSHA 300 Log)
as an integral part of the ODI. (Appendix A lists audit program analyses, data validation study
reports, and related studies conducted to date.)
OSHA established the audit program with its onsite audits of employer injury and illness
records to annually assess and monitor the quality of employer injury/illness recordkeeping
nationwide.1 (Appendix B describes OSHA’s initial quality control efforts and provides
background on the development of the audit program.) The audit program has focused only on
non-construction establishments, with the exception of the sixth year of the program when
OSHA conducted a pilot of the audit methodology in a sample of construction establishments.
Budget constraints have precluded implementation of the audit program in construction
establishments.
OSHA considers onsite audits of employer injury and illness records a key method of
verifying the accuracy of data submitted for the ODI and for estimating the extent of employer
compliance with OSHA recordkeeping requirements defined in 29 CFR 1904. In order to
implement this quality control component, OSHA developed a protocol for reviewing a sample
of employee injury/illness records within a sample of establishments (see Appendix C) as well as
software to streamline a process that was otherwise too resource intensive for widespread use.
This report presents findings on the analysis of audits on calendar year (CY) 2006
employer injury/illness recordkeeping. It is the eleventh audit program analysis.
AUDITS OBJECTIVE
The primary objective for OSHA in the eleventh year of the audit program was to
estimate CY 2006 employer injury/illness recordkeeping accuracy nationwide based on OSHA
recordkeeping audits conducted according to an established protocol at a sample of non-construction establishments drawn from the standard ODI universe.
In the sections that follow, OSHA presents its methodology, analytical approach, and
findings in regard to these objectives using the information gathered during audits on CY 2006
recordkeeping. The final section of the report provides a summary of findings and
recommendations based on the study.
1 This report represents the reporting-year analysis of a three-year analysis cycle that includes two interim-year summary analyses followed by a comprehensive report for submission to the Office of Management and Budget (OMB).
AUDIT METHODOLOGY AND ANALYTICAL APPROACH
The methodology for the analysis covers efforts to maintain the level of audit program
participation experienced over most years, the implementation of sample selection from a
standard ODI universe that allows for generalizing the estimate of overall recordkeeping
accuracy to ODI establishments nationwide and facilitates year-to-year comparisons, and the
continued emphasis on adherence to the protocol’s procedures for conducting the audits.
State Plan State Participation
OSHA invites State Plan States to participate in the audit program on a voluntary basis.
Based on audit program experience, OSHA assumes that about ten States will be able to
participate in a particular year, with some year-to-year variation. This time, the number of States
participating in the program was six, which is five fewer than last time. All six of the States
(California, Iowa, Kentucky, Maryland, Minnesota, and Virginia) participated in the program last
year. Five of the 11 States that participated last year (Arizona, Indiana, New Mexico, North
Carolina, and Utah) opted out this time.2 OSHA notes that State Plan State participation is back
up to ten for the recordkeeping audit cycle currently under way for reviewing CY 2007
injury/illness data.
Despite the drop this year in State Plan State participation, overall the sample of audited
establishments is representative of the standard ODI universe by industry (see Table 2 note on
OSHA’s further evaluation of fit between the audit sample and the universe of establishments by
industry).
Sampling Universe
This was the eighth year in which OSHA implemented the audit program by selecting a
sample of audit establishments from a standard ODI universe.3 For each year, OSHA compiles
the standard ODI universe using a file from Dun & Bradstreet that provides the most currently
available industry, employment, and location information on establishments. OSHA defines a
standard ODI universe to be able to generalize the annual estimates of overall accuracy for
employer injury and illness recordkeeping to ODI establishments nationwide and to facilitate
year-to-year comparisons.
For this year of the program, OSHA again selected establishments from a universe that
covered industries included in all years of the ODI. More specifically, OSHA used a standard
ODI universe that included approximately 117,000 establishments nationwide that met the
following criteria:
2 Of the 23 State Plan States overall, 6 of them have decided not to participate in the ODI. Also, the Commonwealth of Puerto Rico and the Virgin Islands (a U.S. Territory) are considered ineligible for participation in the ODI. Another 11 State Plan States (including the five noted above) chose not to participate in this year's audit program.

3 The objective is to address analytical limitations associated with selecting a sample from the collection year-specific ODI universe, which is subject to shifting characteristics. In the initial years of the audit program, the sample was selected from a universe of establishments participating in the ODI in a specific year.
• Establishment is located in one of the States participating in the ODI (i.e., either in the Federal OSHA jurisdiction or in one of the participating State Plan States).

• Establishment has total employment of 40 or more.

• Establishment is in one of the Standard Industrial Classification (SIC) codes selected for any of the annual ODI collections.4
Sample Selection of Establishments
As in other years of the audit program, OSHA committed to conducting 250 audits.
Previous analyses have established that selecting and assigning a sample of exactly 250 audits at
the outset is unlikely to yield the optimum number of completed audits for the analysis. A
shortfall can result because in some instances audits are not conducted due to constraints on
resources.
The target sample size is based on a National Opinion Research Center (NORC)
determination that this approximate number of audits would provide an acceptable level of
power for detecting overall accuracy of employer recordkeeping at-or-above a 95 percent
threshold. This also would enable OSHA to provide reasonable estimates of accuracy for the
universe of establishments. As established for the previous audit program analyses, at lower level
break-outs, such as at the industry level, universe estimates would be considered unstable
because of the relatively small number of establishments that might occur in the subcategories of
the sample. (See National Opinion Research Center, Final Report: Sample Design for a
Statistically Valid Evaluation of Accuracy and Completeness of an Establishment's OSHA-Mandated Employee Records, 1996.)
To select a sample of audit establishments from a standard ODI universe and to increase
the likelihood of having 250 completed audits available for the analysis, OSHA implemented the
following steps:
Step 1. Select an initial sample of establishments from the standard ODI universe.
OSHA made an initial selection of 399 establishments from a standard ODI universe file
that was compiled from a Dun & Bradstreet establishments file. This sample selection file
included all 117,306 establishments that met the criteria established for the audit
program’s standard ODI universe. Before making this initial selection, OSHA sorted
establishments in the sampling frame by industry code, region, and employment size,
resulting in an implicit stratification. OSHA then drew the sample of establishments using a systematic selection procedure.

4 Several program cycles ago, OSHA modified the criteria somewhat by including establishments in SIC codes from any of the ODI collections except SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals). OSHA made this refinement to the definition of the standard ODI universe to address the possibility that the number of establishments in these large industry sectors, which are only selectively included in the ODI, could affect the overall representativeness of the audit sample selection. (Note that the standard ODI universe is based on SIC codes for consistency with the recordkeeping rule, which currently is defined by SIC codes.)
Step 2. Include all establishments selected for the initial sample in the ODI universe
for the CY 2006 collection year.
OSHA included all 399 establishments selected from the standard ODI universe in the
CY 2006 ODI collection universe.
Step 3. At completion of the ODI data collection cycle for CY 2006, eliminate from
the sample any establishments that do not meet audit program requirements.
After the CY 2006 ODI collection cycle was completed, OSHA screened from the sample
any establishments located in State Plan States that had chosen not to participate in the
audit program. From those that remained, any establishments for which OSHA did not
have an OK-verified submission from the CY 2006 collection were screened out. (OSHA
submission tracking codes that indicate the data are OK verified are: OK, OKPD, and
ECRG. See Appendix D for a glossary of tracking codes.) As a result, 139 establishments
were eliminated from the sample in this step.
Step 4. Assign the remaining sample establishments for an audit.
OSHA assigned 260 establishments for an audit. When any of the original audit
establishment selections could not be audited (e.g., when found to be out-of-business or
to be a headquarters location), replacement establishments were selected from the
collection year CY 2006 ODI universe. An establishment could be selected as a
replacement if it was in the same jurisdiction as the original selection, it matched on the
industry code, and the average number of employees was the same or similar.
Step 5. Eliminate any completed audits that were not properly conducted.
As files for audits that auditors were able to conduct and complete were submitted,
OSHA reviewed the files and determined which ones followed requirements in the
recordkeeping protocol (see Appendix C). Based on this review, OSHA eliminated 1
audit due to an out-of-scope SIC code. (18 of the assigned audits were not conducted.)
Audit Protocol and Sampling of Employees within Establishments
The same approach to sampling employees within establishments and essentially the
same protocol were used this time as in past years of the audit program. (Appendix C presents
OSHA’s compliance instruction on recordkeeping audits.) Furthermore, OSHA maintained an
emphasis on adherence to the protocol in its training for staff conducting the audits.
In analyzing the recordkeeping audit program, OSHA has found that the audit protocol
establishes an efficient approach for conducting and documenting recordkeeping audits.
Adherence to the protocol and use of the ORAA software system provide auditors with an
efficient process that allows the Agency to feasibly monitor the quality of employer injury and
illness recordkeeping.
An important feature of the ORAA software is the built-in function that enables the
auditor to determine the number of employees to be sampled at each establishment. After the
auditor enters the number of employees at the establishment and the number of cases on the
employer’s OSHA 300 Log, the software calculates the number of employees to be sampled.
This sample is based on certain assumptions about the occurrence of recordable injuries and
illnesses, the level of recording accuracy, and the likelihood of detecting errors in recording.
Statistical assumptions that were established to determine the sample size included a threshold of
accuracy of 95 percent, an alpha level of 0.05, and a power of 75 percent. (A full discussion of
the statistical power analysis can be found in the National Opinion Research Center Final
Report: Sample Design for a Statistically Valid Evaluation of Accuracy and Completeness of an
Establishment’s OSHA-Mandated Employee Records—see especially pp.4-6.)5
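The report cites NORC's power analysis rather than reproducing its formula, so the following sketch only illustrates the general type of calculation involved: for an exact binomial test of the hypothesis that an establishment's error rate is at or below 5 percent (the 95 percent accuracy threshold) at alpha = 0.05, find the smallest employee sample giving at least 75 percent power against an assumed higher error rate. The alternative error rate of 15 percent and the simple binomial model are illustrative assumptions; the actual ORAA calculation also conditions on the establishment's employment and the number of cases on the Log, which this sketch ignores.

    from scipy.stats import binom

    def required_sample_size(p0=0.05, p1=0.15, alpha=0.05, power=0.75, n_max=1000):
        """Smallest n for which an exact binomial test of H0: error rate <= p0
        (i.e., accuracy >= 95%) rejects with probability >= power when the
        true error rate is p1."""
        for n in range(5, n_max + 1):
            # Smallest error count c whose upper-tail probability under H0 is <= alpha.
            c = next(k for k in range(n + 1) if binom.sf(k - 1, n, p0) <= alpha)
            if binom.sf(c - 1, n, p1) >= power:    # power at the alternative p1
                return n, c
        return None

    n_employees, min_errors = required_sample_size()
    print(n_employees, min_errors)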
Analysis
OSHA implemented the same general approach for analyzing the results of the
establishment audits as was used in past years of the program. The analysis approach addressed
two general areas:
Methodology for Implementing the Audit Cycle
• Reviewing the documentation on the audits for completeness and adherence to the established protocol.

• Comparing the characteristics of the sample of establishments audited to those of establishments in the standard ODI universe.

Results Related to the Accuracy of Employer Injury/Illness Recordkeeping

• Calculating universe estimates of the overall accuracy of employer injury and illness recordkeeping based on the results of the audits and the sample design.

• Comparing recordkeeping accuracy estimates from the eleventh-year audit program with results from the tenth year.

• Performing a case-level analysis that describes the types of recordable cases the auditors identified in the sample and details the recording errors they discovered.

• Comparing the employers' Log Summary and employment and hours worked data at the establishment at the time of the audit with the data submitted to OSHA in response to the CY 2006 ODI collection request.

5 Although the audit program is well established and the protocol operates efficiently, during the interim years of the audit program reporting cycle, OSHA revisited assumptions regarding parameters for sampling employees within an establishment—in keeping with a previous analysis report recommendation—and determined a minor adjustment would keep the establishment sampling methodology current with generally reported national trends in workplace injury and illness incidence rates. OSHA then implemented the adjustment for the version of the audit software released for recordkeeping audits on CY 2007 employer recordkeeping.
Approach for Analysis of the Implementation of the Audit Cycle
The compliance officers’ documentation of the audits was carefully reviewed to
confirm the procedures used in the audit. A total of 241 audits was usable for the universe
estimates, the case-level analysis, and the comparisons made between data on the Log and data
submitted to OSHA for the total recordable cases, DART cases, and hours worked. (This number
of establishment audits available for the analysis is consistent with last year’s analysis that
included 245 establishments.) As in the past, the primary reason for not conducting some of the
audits was resource constraints. Of the audits that were conducted, 1 was excluded based on
OSHA’s review of the documentation for each audit to determine whether auditors had fully
followed the protocol or if an audit should be eliminated for any other reason.
The sample of establishments audited was compared to the standard ODI universe
of establishments by size and industry to determine the representativeness of the sample.
Three principal size group categories based on average employment were used—"all small" (40-99 employees), medium (100-249 employees), and large (≥250 employees). Also, as with the
past eight audit program analyses, a small establishments subcategory of 40-49 employees was
used to continue to assess any effect of the inclusion of smaller establishments in the ODI.
For industry matching, the sample and universe were compared at the 2-digit SIC level.
Also, comparisons were developed for all manufacturing and non-manufacturing establishments.
Table 1 provides the distribution of audited establishments by size group based on
average employment compared to the standard ODI universe. Sample establishments were
selected from this universe and assigned for an audit if the establishment was in the Federal
OSHA jurisdiction or in one of the six State Plan States participating in this year’s audit
program, and if the establishment’s ODI submission for CY 2006 was OK verified.
Table 1
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent of Establishments in the Recordkeeping Audit Sample and
the Standard ODI Universe by Establishment Size Group

Establishment Size Group           Audit Sample (a) Establishments      Standard ODI Universe (b) Establishments
(average number of employees)      Number    Percent (c) of Sample      Number      Percent (c) of Universe
All Small (40-99) (d)                 90        37.35                     65,142       55.53
Medium (100-249)                     114        47.30                     36,958       31.51
Large (≥250)                          37        15.35                     15,206       12.96
All Sizes                            241       100                      117,306       100
Note: OSHA could not assess the audit sample’s representativeness of the universe based on the size category
breakouts presented here. As pointed out by Hays, W.L., in Statistics (5th ed. 1994, Harcourt Brace & Co.),
Pearson’s Chi-Square test would not provide a reliable assessment of goodness of fit, given that only three size
categories are available. This test provides a reasonable approximation only when the number of categories available
for conducting the comparison—size or industry categories in this analysis—is reasonably large.
a. The audit sample is limited to establishment audits that OSHA assigned from the original sample of
establishments, as drawn from the standard ODI universe, and that OSHA determined were usable for the analysis
after confirming that the audits were conducted according to established recordkeeping audit procedures (see CPL in
Appendix C). Establishments in the original sample were assigned for an audit if they were under the OSHA Federal
jurisdiction or in one of the six State Plan States that voluntarily participated in the audit program, and if their CY
2006 OSHA Data Initiative (ODI) submission was OK verified. For the comparison in Table 1, establishment size
group information for the audit sample establishments was derived from the employer-submitted 2006 ODI data.
b. The standard ODI universe includes all establishments that are in States participating in the ODI, have 40 or more
employees, and are in one of the SICs selected for any of the ODI collections—except SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals). Because OSHA has not collected ODI data from all establishments in the standard
ODI universe, for the comparison in Table 1, establishment size group information for establishments in the
standard ODI universe was derived from Dun & Bradstreet data.
c. Because of rounding, percentages may not add to 100.
d. The “all small” size group includes a subset grouping of 12 “small” establishments with 40 to 49 employees. This
grouping represents 4.98 percent of the sample and 16.24 percent of the universe.
The same group of audited establishments presented in Table 1 is compared to the
universe by industry in Table 2 at the 2-digit SIC level. The bottom of Table 2 also presents the
comparison of all manufacturing and non-manufacturing establishments.
Table 2
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent of Establishments in the Recordkeeping Audit Sample and
the Standard ODI Universe by Industry (2-digit SIC) Sorted by
Number of Establishments in the Universe

                                                 Audit Sample (a)          Standard ODI Universe (b)
SIC Code (2-digit level) and Industry            Number   Percent (c)      Number    Percent (c)
                                                          of Sample                  of Universe
80  Health Services                                 39     16.18            14,113    12.03
42  Trucking and Warehousing                        15      6.22             9,800     8.35
35  Machinery, Except Electrical                    16      6.64             8,884     7.57
34  Fabricated Metal Products                       13      5.39             8,076     6.88
27  Printing and Publishing                         15      6.22             6,831     5.82
20  Food and Kindred Products                       15      6.22             6,207     5.29
36  Electric and Electronic Equipment               15      6.22             6,112     5.21
30  Rubber and Misc. Plastics Products              12      4.98             5,160     4.40
51  Wholesale Trade-Nondurable Goods                12      4.98             4,926     4.20
50  Wholesale Trade-Durable Goods                    8      3.32             4,820     4.11
28  Chemicals and Allied Products                    9      3.73             4,603     3.92
52  Building Materials & Garden Supplies            10      4.15             4,064     3.46
37  Transportation Equipment                         6      2.49             3,855     3.29
38  Instruments and Related Products                 4      1.66             3,530     3.01
24  Lumber and Wood Products                         4      1.66             3,445     2.94
26  Paper and Allied Products                        8      3.32             3,192     2.72
32  Stone, Clay, and Glass Products                  4      1.66             2,994     2.55
33  Primary Metal Industries                         8      3.32             2,776     2.37
25  Furniture and Fixtures                           6      2.49             2,144     1.83
23  Apparel and Other Textile Products               3      1.24             2,138     1.82
39  Misc. Manufacturing Industries                   6      2.49             2,111     1.80
22  Textile Mill Products                            3      1.24             1,701     1.45
49  Electric, Gas, and Sanitary Services             2      0.83             1,541     1.31
45  Transportation by Air                            6      2.49             1,532     1.31
01  Agricultural Production-Crops                    0      0                  546     0.47
29  Petroleum and Coal Products                      1      0.41               505     0.43
02  Agricultural Production-Livestock                0      0                  469     0.40
44  Water Transportation                             0      0                  408     0.35
31  Leather and Leather Products                     0      0                  294     0.25
07  Agricultural Services                            0      0                  165     0.14
43  United States Postal Service                     0      0                  146     0.12
47  Transportation Services                          0      0                  127     0.11
21  Tobacco Manufacturers                            0      0                   91     0.08
54  Food Stores                                      1      0.41                 0     0
All Manufacturing SICs                             148     61.41            71,014    60.54
All Non-Manufacturing SICs                          93     38.59            46,292    39.46
All SICs                                           241    100              117,306   100
Note on representativeness of sample: Overall, the sample of audited establishments appears representative of the
standard ODI universe by industry, reflecting the effect of implicit stratification. OSHA further evaluated and
supported this finding with Pearson’s Chi-Square test for goodness of fit, using the many more categories available
for this comparison than for the size category comparison. In applying the test, no significant deviations from fit
were observed (Chi-Square = 20.67, df = 30, n.s.).
a. The audit sample is limited to establishment audits that OSHA assigned from the original sample of
establishments, as drawn from the standard ODI universe, and that OSHA determined were usable for the analysis
after confirming that the audits were conducted according to established recordkeeping audit procedures (see CPL in
Appendix C). Establishments in the original sample were assigned for an audit if they were under the OSHA Federal
jurisdiction or in one of the six State Plan States that voluntarily participated in the audit program, and if their CY
2006 OSHA Data Initiative (ODI) submission was OK verified. For the comparison in Table 2, establishment
industry information for the audit sample establishments was derived from the employer-submitted 2006 ODI data.
b. The standard ODI universe includes all establishments that are in States participating in the ODI, have 40 or more
employees, and are in one of the SICs selected for any of the ODI collections—except SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals). Because OSHA has not collected ODI data from all establishments in the standard
ODI universe, for the comparison in Table 2, industry information for establishments in the standard ODI universe
was derived from Dun & Bradstreet data.
c. Because of rounding, percentages may not add to 100.
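The goodness-of-fit check described in the note above can be reproduced in outline with the counts from Table 2. The report does not spell out how sparse industry categories were pooled to arrive at 30 degrees of freedom, so the pooling rule below (combining cells with an expected sample count under 1) is an illustrative assumption, and the resulting statistic is not expected to match the published value exactly.

    from scipy.stats import chisquare

    # Audit-sample counts and standard ODI universe counts by 2-digit SIC,
    # in the order listed in Table 2 (SIC 80, 42, 35, ..., 21, 54).
    sample_counts = [39, 15, 16, 13, 15, 15, 15, 12, 12, 8, 9, 10, 6, 4, 4, 8, 4, 8,
                     6, 3, 6, 3, 2, 6, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1]
    universe_counts = [14113, 9800, 8884, 8076, 6831, 6207, 6112, 5160, 4926, 4820,
                       4603, 4064, 3855, 3530, 3445, 3192, 2994, 2776, 2144, 2138,
                       2111, 1701, 1541, 1532, 546, 505, 469, 408, 294, 165, 146,
                       127, 91, 0]

    n = sum(sample_counts)                               # 241 audited establishments
    total = sum(universe_counts)                         # 117,306 universe establishments
    expected = [n * u / total for u in universe_counts]  # expected sample counts under fit

    # Pool sparse categories into a single "other" cell so the statistic is defined.
    obs_pooled, exp_pooled = [], []
    other_obs = other_exp = 0.0
    for o, e in zip(sample_counts, expected):
        if e < 1.0:
            other_obs += o
            other_exp += e
        else:
            obs_pooled.append(o)
            exp_pooled.append(e)
    obs_pooled.append(other_obs)
    exp_pooled.append(other_exp)

    stat, p_value = chisquare(obs_pooled, f_exp=exp_pooled)
    print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")

Under this particular pooling the comparison runs on 27 cells (26 degrees of freedom), which is one reason the statistic will differ from the df = 30 figure reported in the note.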
Approach for Analysis of Results Related to the Accuracy of Injury/Illness Recordkeeping
The universe estimate analysis focused on the types of recording errors that affect
an employer’s injury and illness rate, including:6
• Underrecording of total recordable cases—The employer does not record an injury or illness that should have been entered on the Log.

• Underrecording or misrecording of DART cases (days away from work, restriction, or transfer injury/illness cases)—Either the case is not recorded on the Log or the case is recorded as a non-DART case.

Recording and correctly classifying DART cases affects the accuracy of an establishment's combined DART injury and illness rate, which is a rate that OSHA uses for targeting purposes. (In more recent years, OSHA also has been using the establishment's days-away-from-work case rate in conjunction with the DART rate for targeting.) Other types of recording errors, such as incorrect day counts or an injury recorded as an illness, were not analyzed because they do not affect the calculation for either the DART injury and illness rate or the days-away-from-work rate.
The same steps used in past years’ analyses were involved in classifying an establishment
as accurate in the recording of total recordable cases and the recording of DART cases on the
Log. Estimates of the percent of establishments with accurate recording of these cases are based
on the sample design for both the selection of establishments and the sampling of employees
within establishments. The steps are as follows:
Step 1. A significance test was applied to the results of the sample of employee records reviewed for each audit to determine whether an establishment should be classified as at-or-above a 95 percent threshold of accuracy. (See National Opinion Research Center, Final Report: Sample Design for a Statistically Valid Evaluation of Accuracy and Completeness of an Establishment's OSHA-Mandated Employee Records, 1996, page 5 for an explanation of the threshold of accuracy.)

Step 2. The percent of sample establishments at-or-above the 95 percent threshold of accuracy was calculated. The sample percent provides an estimate of the proportion of establishments at-or-above the 95 percent threshold of accuracy in the standard ODI universe. The projection to this universe is valid because of the implicit stratified sample design for the sample of establishments.

Step 3. A standard error of the percent estimate was calculated using the simple random sampling variance estimator.
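A minimal sketch of Steps 2 and 3, assuming only the simple random sampling variance estimator named above; feeding in the counts later reported in Table 3 reproduces the published estimates and standard errors.

    import math

    def universe_estimate(n_accurate, n_audited):
        """Percent of establishments classified as accurate and its standard error
        under the simple random sampling variance estimator."""
        p = n_accurate / n_audited
        se = math.sqrt(p * (1 - p) / n_audited)
        return 100 * p, 100 * se

    print(universe_estimate(237, 241))   # total recordable cases: approx (98.34, 0.82)
    print(universe_estimate(232, 241))   # DART cases: approx (96.27, 1.22)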
6 Because the auditors did not find any cases of underrecorded or misrecorded fatalities in the sample, no analysis was required for this type of case. Auditors did find two fatality cases correctly recorded on the Log.
Universe estimates for any given year, however, cannot be generalized to all of the
nation’s workplaces for the following reasons:
• The ODI focuses on selected high-rate industries and excludes establishments with fewer than 40 employees.

• Not all State Plan States participate in the ODI or the audit program.

Additional analyses would need to be conducted before such use of the estimates could be supported.
OSHA examined the overrecording of cases in regard to the universe estimates as a
separate step. Overrecorded cases are those cases found on the employer’s Log that the auditor
has determined are non-recordable based on a review of employee records during the audit. For
example, an injury occurred but only required first aid.
See the Findings section for the results of the universe estimates analysis.
A case-level analysis looked at the number and percent of establishments with
particular types of injury and illness case recording results. The types of underrecording
errors for total recordable and DART cases reconstructed in the sample were also determined.
The numbers in the case-level analysis are unweighted and are not intended to support conclusions about the universe of establishments. The information suggests relative distributions of the types of recording errors, but additional study or a redesigned, larger sample for future audits would be required to fully interpret their significance. See the Findings section for the results of this analysis.
The employer’s Log Summary at the establishment was compared with the data
submitted to OSHA. Comparisons were made between data on the Log and submitted data for
the total recordable cases, DART cases, and hours worked data by size group and by
manufacturing versus non-manufacturing establishments in the universe. The analysis also
looked at the reasons for the differences between data on the Log and submitted data. The
ORAA software includes a pick-list of reasons provided by establishment recordkeepers and the
capability to distinguish between primary and secondary reasons for differences.
This component of the study used the same 241 audits that were available for use in the
universe estimate and the case-level analyses. See the Findings section for the results of this
analysis.
FINDINGS
This section presents the results related to the accuracy of employer injury and illness
recordkeeping. The assessment includes summary indicators for the universe of establishments,
the types of recordkeeping errors that auditors identified in the sample, and a comparison of the
injury/illness and employment data submitted for the ODI collection with that maintained at the
establishment.
Universe Estimates for CY 2006 Recordkeeping
The primary objective of the audits is to derive estimates of the overall accuracy of
employer injury and illness recordkeeping (as previously defined). In the first three years of the
audit program, the sample results could be applied only to the sampling universe made up of
establishments that were in the ODI universe for the specific collection year and that were
participating in the audit program.
As in more recent years, OSHA again selected a sample from a universe that is
representative of nearly all establishments nationwide included in the ODI. An exception to the
sample’s representativeness of all ODI establishments was established by a refinement OSHA
made a number of years ago to the standard universe. The change involved excluding two
industries—SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals)—for which OSHA
collects Log summary data and employment information from only a portion of the population of
establishments. For other industries, OSHA collects data from the entire population of
establishments that meet ODI criteria. OSHA made the adjustment to consider the possibility that
the population size of these industry sectors (about 10,000 establishments each) could affect the
overall representativeness of the audit sample selection.
Universe estimates for any given year cannot be generalized to all of the nation’s
workplaces because the ODI focuses on selected high-rate industries and excludes
establishments with fewer than 40 employees. Also, not all State Plan States participate in the
ODI or the audit program. Additional analyses would need to be conducted before such use of
the estimates could be supported.
The sample of establishments and the sample of employees within establishments were
designed to allow a reasonable estimation of the extent to which employers enter recordable
cases on their Logs (the extent to which cases are not underrecorded) or correctly classify DART
cases. This year, two fatality cases were identified by auditors in the sample of establishments,
representing the only fatality cases since a first case was identified in audits on CY 2000
recordkeeping.
Table 3 provides the results of the universe estimates analysis for CY 2006
recordkeeping. Generalizing from the sample of audit establishments, the percent of
establishments classified with accurate recordkeeping (at-or-above the 95 percent threshold) is
above 96 percent for both total recordable and DART injury and illness cases. Based on 95
percent confidence intervals for the two estimates, the percentages of 98.34 percent for total
recordable cases and 96.27 percent for DART cases are not statistically different.
The universe estimates for this year are consistent with the level of accuracy observed for
employer injury and illness recordkeeping over previous years of the audit program. OSHA
applied a statistical test to the accuracy estimates for CY 2006 and CY 2005, which is the lower
of the two previous years shown in Table 4, and found no significant difference in the means for
either total recordable or DART cases.
Table 3
Universe Estimates for OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent* of Establishments Classified as Accurate in Recording the Number of
Total Recordable and Days Away, Restriction, or Transfer (DART) Injury and Illness Cases
with the Standard Error of the Estimate

                         2006 AUDIT RESULTS
                         Number of establishments           Percent of establishments
                         classified with accurate           classified with accurate           Standard error of
Type of Case             recording (at-or-above the 95%     recording (at-or-above the 95%     the estimate
                         threshold of accuracy)             threshold of accuracy)             (percent)
Total Recordable         237 / 241 (4 below)                98.34%                             0.82%
DART                     232 / 241 (9 below)                96.27%                             1.22%

* The percent of establishments "at or above" the 95% threshold of accuracy calculated from the sample also provides an estimate that can be extrapolated to the standard ODI universe (i.e., establishments nationwide that are in States participating in the ODI, have 40 or more employees, and are in one of the SICs selected for any of the ODI collections—except SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals)).
Note: The standard error of the estimate was calculated using the simple random sampling variance estimator.
Table 4
Universe Estimates for OSHA Audits on CY 2004 and CY 2005 Recordkeeping:
Number and Percent* of Establishments Classified as Accurate in Recording the Number of
Total Recordable and Days Away, Restriction, or Transfer (DART) Injury and Illness Cases
with the Standard Error of the Estimate

                         2004 AUDIT RESULTS
                         Number of establishments           Percent of establishments
                         classified with accurate           classified with accurate           Standard error of
Type of Case             recording (at-or-above the 95%     recording (at-or-above the 95%     the estimate
                         threshold of accuracy)             threshold of accuracy)             (percent)
Total Recordable         245 / 256 (11 below)               95.70%                             1.26%
DART                     244 / 256 (12 below)               95.31%                             1.32%

                         2005 AUDIT RESULTS
                         Number of establishments           Percent of establishments
                         classified with accurate           classified with accurate           Standard error of
Type of Case             recording (at-or-above the 95%     recording (at-or-above the 95%     the estimate
                         threshold of accuracy)             threshold of accuracy)             (percent)
Total Recordable         232 / 245 (13 below)               94.69%                             1.43%
DART                     229 / 245 (16 below)               93.47%                             1.57%

* The percent of establishments "at or above" the 95% threshold of accuracy calculated from the sample also provides an estimate that can be extrapolated to the standard ODI universe (i.e., establishments nationwide that are in States participating in the ODI, have 40 or more employees, and are in one of the SICs selected for any of the ODI collections—except SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals)).
Note: The standard error of the estimate was calculated using the simple random sampling variance estimator.
Tables 5 and 6 show the distribution of establishments that fell below the 95 percent
threshold of accuracy by establishment size and industry category for total recordable and DART
cases, respectively. For both total recordable and DART cases, the overall percent of
establishments below the threshold of accuracy was similar between manufacturing and non-manufacturing establishments.

Compared to audits on CY 2005 recordkeeping, both manufacturing and non-manufacturing establishments did better this year in recording both total recordable and DART cases. Last year, the overall percent of establishments below the threshold of accuracy for total recordable cases was 5.81 for manufacturing and 4.44 for non-manufacturing establishments. For DART cases, the corresponding percents were 5.81 and 7.78.
Table 5
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent of Establishments Below the Threshold of Accuracy for
Total Recordable Cases by Establishment Size Group
and Manufacturing vs. Non-Manufacturing Industry Category

Establishment Size Category            Manufacturing               Non-Manufacturing
(average number of employees)          Number       Percent        Number       Percent
All Small (40-99)                      1 / 56        1.79          0 / 34        0.00
  Small (40-49)*                       0 / 8         0.00          0 / 4         0.00
Medium (100-249)                       1 / 65        1.54          1 / 49        2.04
Large (≥250)                           1 / 27        3.70          0 / 10        0.00
Total                                  3 / 148       2.03          1 / 93        1.08
                                       (145 pass / 148)            (92 pass / 93)

* The "small" size group is a subset of the "all small" size group.
Table 6
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent of Establishments Below the Threshold of Accuracy for
Days Away, Restriction, or Transfer (DART) Injury and Illness Cases
by Establishment Size Group
and Manufacturing vs. Non-Manufacturing Industry Category

Establishment Size Category            Manufacturing               Non-Manufacturing
(average number of employees)          Number       Percent        Number       Percent
All Small (40-99)                      1 / 56        1.79          0 / 34        0.00
  Small (40-49)*                       0 / 8         0.00          0 / 4         0.00
Medium (100-249)                       4 / 65        6.15          2 / 49        4.08
Large (≥250)                           1 / 27        3.70          1 / 10       10.00
Total                                  6 / 148       4.05          3 / 93        3.23
                                       (142 pass / 148)            (90 pass / 93)

* The "small" size group is a subset of the "all small" size group.
In examining the overrecording of cases (i.e., cases on the Log that the auditor determined to be non-recordable) with regard to the universe estimates, OSHA found the following:

•  Overall. A total of 76 entries (75 injuries and 1 entry that did not indicate either injury or illness) were found on employers’ Logs for incidents that are not considered OSHA-recordable cases. These overrecorded cases were distributed across 51 establishments. At 37 of these 51 establishments, only one instance of overrecording was found.

   Only 12 of these 76 overrecorded cases were classified as DART cases by employers. These 12 overrecorded DART cases were distributed across 8 establishments.

•  Total recordable cases. Overall, 233 of 241 (96.68%) establishments were at-or-above the 95 percent threshold of accuracy with respect to overrecording.

   Of the 237 establishments at-or-above the 95 percent threshold of accuracy with respect to underrecording of recordable cases, 229 (96.62%) were found to be at-or-above the threshold with respect to overrecording. None of the four establishments below the 95 percent threshold of accuracy with respect to underrecording tested below the 95 percent threshold of accuracy for overrecording.
•  DART cases. Overall, 239 of 241 (99.17%) establishments were at-or-above the 95 percent threshold of accuracy with respect to overrecording.

   Of the 232 establishments at-or-above the 95 percent threshold of accuracy with respect to underrecording of DART cases, 230 (99.14%) were found to be at-or-above the threshold with respect to overrecording. None of the nine establishments below the 95 percent threshold of accuracy for DART underrecording tested below the 95 percent threshold of accuracy for overrecording.
Case Analysis
The distribution of cases was analyzed to provide descriptive information about the
auditors’ findings in the sample of establishments. The data are raw frequencies of the
reconstructed cases from the audits. The analysis of cases by establishments is different from the
determination of the universe estimates in that the sample size and design did not provide for
estimates at this level of detail. The breakdown of the different types of cases identified by the auditors is not weighted by their respective contribution to the sample. As a result, broad
conclusions cannot be drawn about the universe from these findings.
Table 7 indicates the types of recordkeeping errors that the auditors identified in the
discovered cases. In the sample of establishments, the percentage of cases not recorded at all was
higher than the percentage of errors involving either DART cases recorded as non-DART cases
or non-DART cases recorded as DART cases. More DART cases recorded as non-DART cases
were found than non-DART cases recorded as DART cases. The analysis found, however, that
these recordkeeping errors are not widely distributed across the audit sample. For instance, 5
establishments (with a total of 13 cases) accounted for over 46 percent of the 28 underrecorded
DART cases found by auditors. Similarly, for the roughly 48 percent of errors attributable to cases not recorded at all, 5 establishments (with a total of 11 cases) accounted for almost 35 percent of the 32 cases identified by auditors that had not been recorded on the employer Logs.
Table 8 shows the types of injury and illness cases identified by the auditors that were not
recorded on the employer Logs. In the sample of establishments, non-DART cases were the
cases most frequently not recorded on the Log for injuries. This was followed by cases only
involving days away from work (DAFW). For illnesses, only one unrecorded case (restricted
work activity or transfer case) was found by auditors.
Table 9 presents the categories of misrecording of DART cases identified by the auditors.
In the sample of establishments, injury cases only involving restricted work activity or transfer
(RWA) were the type of cases most often misrecorded on the Log as non-DART cases. For
illnesses, only two misrecorded cases were identified by auditors; one case involved DAFW only
and the other RWA only.
Table 7
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent of Recordable Injury and Illness Cases Identified by Auditors
by Type of Recordkeeping Errors*

                                               Recordable Cases
Type of Recording Error                        Number         Percent**
Not Recorded                                   32 / 577       5.55
DART Recorded as Non-DART                      28 / 577       4.85
Non-DART Recorded as DART                      6 / 577        1.04
Total Recording Errors (above)                 66 / 577       11.44
Total Cases with None of the Above Errors      511 / 577      88.56
Total                                          577            100

* The frequencies in this table are unweighted and should not be used to draw broad conclusions about the recordkeeping audit universe.
** Because of rounding, percentages might not add to 100.
Table 8
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent of Recordable Injury and Illness Cases Identified by Auditors
and Not Recorded on the Employer’s Log*

Injury/Illness   Type of Case                                           Number of     Number of Cases   Percent of        Percent of
Category                                                                Cases Not     Discovered by     Category Not      All Cases
                                                                        Recorded      Auditor           Recorded          Not Recorded
Injuries         Non-Days Away, Restriction, or Transfer (DART) Cases   10            148               10 / 148 = 6.76   10 / 32 = 31.25
                 Days Away From Work (DAFW) Only                        9             125               7.2               28.13
                 Restricted Work Activity or Transfer (RWA) Only        8             205               3.9               25
                 DAFW and RWA                                           4             75                5.33              12.5
                 All Types for Injuries (Total)                         31            553               5.61              96.88
Illnesses        Non-DART Cases                                         0             10                0 / 10 = 0        0 / 32 = 0
                 DAFW Only                                              0             4                 0                 0
                 RWA Only                                               1             8                 12.5              3.13
                 DAFW and RWA                                           0             2                 0                 0
                 All Types for Illnesses (Total)                        1             24                4.17              3.13
Injuries and     Non-DART Cases                                         10            158               10 / 158 = 6.33   10 / 32 = 31.25
Illnesses        DAFW Only                                              9             129               6.98              28.13
Combined         RWA Only                                               9             213               4.23              28.13
                 DAFW and RWA                                           4             77                5.19              12.5
                 All Types (Total)                                      32            577               5.55              100

* The frequencies in this table are unweighted and should not be used to draw broad conclusions about the recordkeeping audit universe.
Table 9
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Number and Percent of Recordable Days Away, Restriction, or Transfer (DART) Injury and Illness Cases
Identified by Auditors and Recorded on the Employer’s Log as Non-DART Cases*

Injury/Illness   Type of Case                                           Number Cases     Number Cases    Percent of Category   Percent of All DART
Category                                                                Recorded as      Discovered by   Not Recorded as       Cases Recorded as
                                                                        Non-DART Cases   Auditor         DART Case             Non-DART Cases
Injuries         Days Away from Work (DAFW) Only                        4                125             4 / 125 = 3.2         4 / 28 = 14.29
                 Restricted Work Activity or Transfer (RWA) Only        19               205             9.27                  67.86
                 DAFW and RWA                                           3                75              4                     10.71
                 All Types for Injuries (Total)                         26               405             6.42                  92.86
Illnesses        DAFW Only                                              1                4               1 / 4 = 25            1 / 28 = 3.57
                 RWA Only                                               1                8               12.5                  3.57
                 DAFW and RWA                                           0                2               0                     0
                 All Types for Illnesses (Total)                        2                14              14.29                 7.14
Injuries and     DAFW Only                                              5                129             5 / 129 = 3.88        5 / 28 = 17.86
Illnesses        RWA Only                                               20               213             9.39                  71.43
Combined         DAFW and RWA                                           3                77              3.9                   10.71
                 All Types (Total)                                      28               419             6.68                  100

* The frequencies in this table are unweighted and should not be used to draw broad conclusions about the recordkeeping audit universe.
Submission Comparison Analysis
Stringent criteria were used for the submission comparison. The analysis considered the
auditors’ comparison of the employers’ injury/illness and hours worked data submitted for the
ODI with the injury and illness data on the Log and the hours worked provided by the employer
at the time of the audit. For this analysis, OSHA used all 241 audits available for the universe
estimate and case-level analysis.
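The per-establishment grouping behind Tables 10 through 13 is a three-way comparison of the submitted value with the value found at the audit; a minimal sketch follows. The function and argument names are illustrative and are not taken from the audit software.

    def compare_submission_to_audit(submitted, found_at_audit):
        """Classify one establishment's count (cases or hours) for the comparison tables."""
        if found_at_audit < submitted:
            return "Audit Less"
        if found_at_audit > submitted:
            return "Audit More"
        return "Audit Same"

    # Hypothetical establishment: 7 DART cases submitted to the ODI,
    # 9 DART cases on the Log at the time of the audit.
    print(compare_submission_to_audit(submitted=7, found_at_audit=9))  # Audit More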
As shown in Table 10, DART cases had the highest percent of establishments with
exactly the same data. For total recordable cases, the audit data were both more and less than the
ODI collection submission for all categories (i.e., there was no pattern to the differences). For
DART cases where audit data differed from submitted data, there were consistently more instances, across all establishment size categories, in which the audit found more cases on the Log than had been submitted (as opposed to fewer). The “all small” category had the highest percentages of
establishments with the same number of total recordable and DART cases for both the ODI
submission and the onsite Log.
As shown in Table 11, the percent for all establishments with the same data for hours
worked—submitted for the ODI and provided by the employer at the time of the audit—was
similar to the results in the comparison on type of cases. The audits found more hours worked for
firms in the “medium” category than for the other size groups.
Table 12 indicates that non-manufacturing establishments had a higher percentage of establishments with data that matched exactly for DART cases and for total recordable cases than manufacturing establishments did. For hours worked, however, manufacturing establishments had a higher percentage with matching data than non-manufacturing establishments, as shown in Table 13.
Table 10
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Results of the Comparison of Total Recordable Injury and Illness Cases and
Days Away, Restriction, or Transfer (DART) Injury and Illness Cases Submitted to OSHA for the Data Collection
with Data on the Employer’s Log as Found During Audits by Establishment Size

Establishment Comparison Results: Total Recordable Injury and Illness Cases

Establishment Size Group          Establishments    Audit Less           Audit Same           Audit More
(average number of employees)                       Number    Percent    Number    Percent    Number    Percent
All Small (40-99)                 90                6         6.67       78        86.67      6         6.67
  Small (40-49)*                  12                2         16.67      8         66.67      2         16.67
Medium (100-249)                  114               6         5.26       93        81.58      15        13.16
Large (≥250)                      37                4         10.81      26        70.27      7         18.92
ALL SIZES                         241               16        6.64       197       81.74      28        11.62

Establishment Comparison Results: DART Injury and Illness Cases

Establishment Size Group          Establishments    Audit Less           Audit Same           Audit More
(average number of employees)                       Number    Percent    Number    Percent    Number    Percent
All Small (40-99)                 90                2         2.22       82        91.11      6         6.67
  Small (40-49)*                  12                0         0          9         75         3         25
Medium (100-249)                  114               6         5.26       97        85.09      11        9.65
Large (≥250)                      37                3         8.11       29        78.38      5         13.51
ALL SIZES                         241               11        4.56       208       86.31      22        9.13

* The “small” size group is a subset of the “all small” size group.
Table 11
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Results of the Comparison of Hours Worked Data Submitted to OSHA for the Data Collection with
Hours Worked Provided During Recordkeeping Audits by Establishment Size

Establishment Comparison Results: Hours Worked

Establishment Size Group          Establishments    Audit Less           Audit Same           Audit More
(average number of employees)                       Number    Percent    Number    Percent    Number    Percent
All Small (40-99)                 90                10        11.11      70        77.78      10        11.11
  Small (40-49)*                  12                1         8.33       10        83.33      1         8.33
Medium (100-249)                  114               13        11.4       84        73.68      17        14.91
Large (≥250)                      37                1         2.7        31        83.78      5         13.51
ALL SIZES                         241               24        9.96       185       76.76      32        13.28

* The “small” size group is a subset of the “all small” size group.
Table 12
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Results of the Comparison of Total Recordable Injury and Illness Cases and
Days Away, Restriction, or Transfer (DART) Injury and Illness Cases Submitted to OSHA for the Data Collection with
Data on the Employer’s Log as Found During Recordkeeping Audits
by Industry Type (Manufacturing vs. Non-Manufacturing)

Establishment Comparison Results: Total Recordable Injury and Illness Cases

Industry Type               Establishments    Audit Less           Audit Same           Audit More
                                              Number    Percent    Number    Percent    Number    Percent
All Manufacturing SICs      148               11        7.43       119       80.41      18        12.16
All Non-Mfg SICs            93                5         5.38       78        83.87      10        10.75
ALL SIZES                   241               16        6.64       197       81.74      28        11.62

Establishment Comparison Results: DART Injury and Illness Cases

Industry Type               Establishments    Audit Less           Audit Same           Audit More
                                              Number    Percent    Number    Percent    Number    Percent
All Manufacturing SICs      148               8         5.41       126       85.14      14        9.46
All Non-Mfg SICs            93                3         3.23       82        88.17      8         8.6
ALL SIZES                   241               11        4.56       208       86.31      22        9.13
Table 13
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Results of the Comparison of Hours Worked Data Submitted to OSHA for the Data Collection with
Hours Worked Provided During Recordkeeping Audits
by Industry Type (Manufacturing vs. Non-Manufacturing)

Establishment Comparison Results: Hours Worked

Industry Type               Establishments    Audit Less           Audit Same           Audit More
                                              Number    Percent    Number    Percent    Number    Percent
All Manufacturing SICs      148               9         6.08       119       80.41      20        13.51
All Non-Mfg SICs            93                15        16.13      66        70.97      12        12.9
ALL SIZES                   241               24        9.96       185       76.76      32        13.28
As found in past analyses, there are a variety of reasons why the two datasets may differ.
Tables 14 and 15 display the reasons for differences in case counts and hours worked,
respectively. Changes or corrections to the Log after submission to the ODI accounted for
differences in case counts in over 36 percent of the establishments. Clerical errors (e.g., typing
errors) accounted for another 18 percent. Differences of these types do not necessarily indicate
inaccuracy of the data maintained by the employer or submitted to the Agency.
For hours worked, the primary reasons provided to explain differences were: (1) the
number of hours was estimated rather than calculated for the submission, and (2) the submission
included errors associated with omitting hours worked by certain employee groupings (e.g.,
temporary labor or salaried employees).
Many of the differences observed were fairly small. Taking into account all of the
differences, 1 establishment would have changed targeting category relative to the primary
inspection list for OSHA’s Site-Specific Targeting (SST) Program, which is based on either the
DART injury and illness rate or the days away from work (DAFW) injury and illness rate of
establishments as calculated from the ODI data. Specifically, an establishment would have
moved out of the primary list for the high-rate targeting program onto the secondary list. (OSHA
maintains both a secondary and a tertiary inspection list for establishments that are considered a
lesser priority based on lower thresholds for these rates.)
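For reference, the DART and DAFW rates used in SST targeting follow OSHA’s standard incidence-rate arithmetic of cases per 200,000 hours worked (100 full-time-equivalent workers); a minimal sketch is shown below. The establishment figures in the example are hypothetical, and no SST cut-off values are assumed because this report does not state them.

    def incidence_rate(cases, hours_worked):
        """Cases per 100 full-time-equivalent workers (200,000 hours per year)."""
        return cases * 200_000 / hours_worked

    # Hypothetical establishment: 6 DART cases, 3 DAFW cases, 450,000 hours worked.
    dart_rate = incidence_rate(6, 450_000)   # 2.67 per 100 FTE
    dafw_rate = incidence_rate(3, 450_000)   # 1.33 per 100 FTE
    print(round(dart_rate, 2), round(dafw_rate, 2))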
Table 14
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Primary Reasons for Differences Between the Injury and Illness Data
Submitted to OSHA for the Data Collection and Injury and Illness Data
on the Employer’s Log Provided During the Recordkeeping Audits

                                                                                  Primary Reason for Difference*
Reason(s) Given for Difference(s) in Injury and Illness Data                      Number        Percent**
Log change(s) or correction(s) made after the data were submitted, reflecting
new information brought to the attention of recordkeeper(s) pertaining to
cases on the Log                                                                  20            36.36
Other reasons                                                                     11            20.00
Clerical error(s) (e.g., typo or transposition)                                   10            18.18
Checkmark error(s)                                                                5             9.09
Error(s) associated with reporting data from the wrong facility or facilities     5             9.09
Addition error(s)                                                                 2             3.64
Survey processing edit(s) (employer’s Log was otherwise the same as the
submitted data)                                                                   1             1.82
Error(s) associated with omitting reporting components (e.g., temporary
labor, salaried employees)                                                        1             1.82
Blank or auditor could not determine reason                                       0             0.00
Establishment Totals***                                                           55            100

* The audit software also provides fields for noting any secondary reasons given to explain the differences. This analysis considers only the primary reasons.
** Because of rounding, percentages might not add to 100.
*** Although 55 establishments provided a primary reason for a difference (as noted in this table), the difference resulted in a change in total recordable injury and illness case counts for only 44 establishments (see the total of Audit Less and Audit More for Total Recordable Injury and Illness Cases in Table 10). In the 11 instances where there was no impact on the total case count, Log column differences in effect canceled each other out.
Table 15
OSHA Audits on CY 2006 Injury and Illness Recordkeeping:
Primary Reasons for Differences Between the Data on Hours Worked
Submitted to OSHA for the Data Collection and Data on Hours Worked
Provided During the Recordkeeping Audits

                                                                                  Primary Reason for Difference*
Reason(s) Given for Difference(s) in Hours Worked Data                            Number        Percent**
Estimated value instead of actual value                                           19            36.54
Error(s) associated with omitting reporting components (e.g., temporary
labor, salaried employees)                                                        14            26.92
Other reasons                                                                     14            26.92
Error(s) associated with reporting from wrong facility or facilities              5             9.62
Blank or auditor could not determine reason                                       0             0.00
Establishment Totals**                                                            52            100

* The audit software also provides fields for noting any secondary reasons given to explain the differences. This analysis considers only the primary reasons.
** Because of rounding, percentages might not add to 100.
SUMMARY AND RECOMMENDATIONS
Summary
This analysis represents the eleventh year of OSHA’s audit program on employer injury and
illness recordkeeping. The audit program is well established and the protocol operates efficiently.
Across all of the years of the program, a number of findings remain consistent:
•  Based on the estimates of the accuracy of employer injury and illness recordkeeping, the OSHA Log and employment data collected through the ODI are of reasonable quality for OSHA’s targeting and performance measurement purposes.

•  Both overrecording and underrecording are observed.

•  Underrecording errors are not widely distributed across the sample of establishments. A small number of establishments account for most of the underrecorded cases.

•  Differences found in comparing the audit data with the data submitted to OSHA result in very few changes in the inspection targeting category status of establishments.
Findings this year on the CY 2006 employer injury and illness recordkeeping are:
•  Audits available for the analysis. After following the sample selection steps, a total of 241 audits were available for use in the universe estimates, case-level analysis, and comparison of onsite and submitted data. (This year OSHA did not reach the methodology’s target of 250 establishment audits available for conducting the analysis.)

•  Distribution of audit establishments across the standard universe. Overall, the sample of audited establishments appears representative of the standard ODI universe by industry at the 2-digit SIC level, reflecting the effect of implicit stratification.

•  Recordkeeping accuracy universe estimates. Generalizing from the sample of establishments audited for CY 2006 recordkeeping, the percent of establishments classified with accurate recordkeeping (at-or-above the 95 percent threshold) for the standard ODI universe is above 96 percent for both total recordable injury/illness cases (98.34%, SE 0.82%) and DART injury/illness cases (96.27%, SE 1.22%). Further, based on 95 percent confidence intervals for the two estimates, the percentages for total recordable and DART are not statistically different.

   As a separate step in the universe estimates analysis, OSHA also examines the overrecording of cases (i.e., cases found on the employer’s Log that the auditor has determined are non-recordable based on a review of employee records during the audit). Overall, this year’s results on the overrecording of cases are consistent with the level observed previously.
Recommendations
1. OSHA should continue the audit program with its established process as a quality control
mechanism to ensure that the acceptable level of accuracy in employer injury/illness
recordkeeping for the ODI data collection is maintained.
2. OSHA should continue to use the information from the audit analysis in outreach efforts
to promote improvements in employer injury and illness recordkeeping, with an
emphasis on the correct recording of DART cases. For example, this report or summaries
of the findings should be made available to Agency compliance officers conducting the
recordkeeping audits. In addition, this information should be provided to compliance
officers conducting the recordkeeping inspections under the Injury and Illness
Recordkeeping National Emphasis Program (RK NEP), since these inspections follow
procedures very similar to the protocol for the audit program.
3. A further refinement OSHA should consider is to shift from SIC codes to the North
American Industry Classification System (NAICS) for compiling the standard universe
that the Agency uses for selecting the annual sample of audit establishments.
Consideration of this change should involve assessing possible effects on the audit
program methodology.
4. In keeping with a recent recommendation from the Government Accountability Office
(GAO) to make the optional employee interview component of the audit process
mandatory, OSHA should consider potential issues associated with conducting worker
interviews about possible past events (such as employee turnover and possible memory-effects biases) that present challenges for data accuracy. For example, OSHA should consider interviewing workers about any injuries or illnesses that occurred in the reference year and in the most recent calendar year. The compliance officer
would then include a review of the Log for the most recent year if any incidents were
identified in the interviews.
Appendix A
List of OSHA Data Initiative Collection Quality Reports
and Related Studies
The following analyses have been conducted on OSHA’s audit program:
•
OSHA Data Collection Validation Study: Pilot Test on the Data Collection Quality
and Verification of Employer Injury and Illness Records. September 12, 1997 (Final
Report). Eastern Research Group, Inc. (Contract No. J-9-F-3-0043: Task Order No. 5,
Option Year Two.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1996
Employer Injury and Illness Recordkeeping. September 17, 1998 (Final Report). The
Lexington Group, Eastern Research Group, Inc., and the National Opinion Research
Center. (Contract No. J-9-F-7-0043: Task Order No. 7, Base Year.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1997
Employer Injury and Illness Recordkeeping. August 23, 1999 (Final Report). Eastern
Research Group, Inc. and the National Opinion Research Center. (Contract No. J-9-F-7-0053: Task Order No. 7, Option Year One.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1998
Employer Injury and Illness Recordkeeping. September 29, 2000 (Final Report).
Eastern Research Group, Inc. and the National Opinion Research Center. (Contract
No. J-9-F-7-0053: Task Order No. 17, Option Year Two.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1999
Employer Injury and Illness Recordkeeping. September 28, 2001 (Final Report).
Eastern Research Group, Inc. and the National Opinion Research Center. (Contract
No. J-9-F-7-0053: Task Order No. 24, Option Year Three.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2000
Employer Injury and Illness Recordkeeping. September 27, 2002 (Final Report).
Eastern Research Group, Inc. and the National Opinion Research Center. (Contract
No. J-9-F-7-0053: Task Order No. 33, Option Year Four.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2001
Employer Injury and Illness Recordkeeping. December 5, 2003 (Final Report).
Eastern Research Group, Inc. and the National Opinion Research Center. (Contract
No. J-9-F-3-0015: Task Order No. 1, Base Year.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2002
Employer Injury and Illness Recordkeeping—Interim Year Analysis in Multi-Year
Reporting Cycle. September 30, 2005 (Final Report). Eastern Research Group, Inc.
and the National Opinion Research Center. (Contract No. J-9-F-3-0015: Task Order
No. 2, Option Year 1.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2003
Employer Injury and Illness Recordkeeping—Reporting Year Analysis in Multi-Year
Reporting Cycle. September 7, 2006 (Final Report). Eastern Research Group, Inc. and
the National Opinion Research Center. (Contract No. J-9-F-3-0015: Task Order No.
5, Option Year 2.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2004
Employer Injury and Illness Recordkeeping—Summary Report Year. June 18, 2007
(Final Report). Eastern Research Group, Inc. and the National Opinion Research
Center. (Contract No. J-9-F-3-0015: Task Order No. 11, Option Year 3.)
•
OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2005
Employer Injury and Illness Recordkeeping. September 5, 2008 (Final Report).
Eastern Research Group, Inc. and the National Opinion Research Center. (Contract
No. J-9-F-3-0015: Task Order No. 22, Option Year 4.)
Studies related to ODI collection quality include the following:
•
Sample Design for a Statistically Valid Evaluation of Accuracy and Completeness of
an Establishment’s OSHA-Mandated Employee Records. 1996. The National Opinion
Research Center.
•
OSHA Data Collection Validation Study: Initial Assessment of the Accuracy of the
OSHA-Collected Data—An Analysis of the Data Edit Reports and a Review of State
Agency Impressions. February 1997 (Final Report). Eastern Research Group, Inc.
(Contract No. J-9-F-3-0043: Task Order No. 5, Option Year Two.)
•
OSHA Data Collection Validation Study: Descriptive Characteristics of the 1995
OSHA-Collected Data and Comparison with the Bureau of Labor Statistics’ Annual
Survey on Occupational Injuries and Illnesses. September 12, 1997 (Final Report).
Eastern Research Group, Inc. (Contract No. J-9-F-3-0043: Task Order No. 5, Option
Year Two.)
•
OSHA Data Collection Validation Study: Issues with Creating a Matched File for
Comparing the OSHA 200 Log Data Collected by Compliance Officers During Onsite
Interventions with the Injury/Illness Data from the OSHA Log Data Collection.
September 12, 1997 (Final Report). Eastern Research Group, Inc. (Contract No. J-9-F-3-0043: Task Order No. 5, Option Year Two.)
•
A Summary of Findings on the Correlation of Establishment Injury/Illness Rate Data
from the OSHA Data Initiative and the IMIS Log Data. September 25, 2000 (Final
Report). The Lexington Group, Eastern Research Group, Inc., and Dr. Wayne Gray.
(Contract No. J-9-F-7-0043: Task Order No. 23, Subtask 1, Option Year Two.)
•
A Summary of Findings on the Correlation of Establishment Injury/Illness Rate Data
from the OSHA Data Initiative and the BLS Annual Survey. September 25, 2000
(Final Report). The Lexington Group, Eastern Research Group, Inc., and Dr. Wayne
Gray. (Contract No. J-9-F-7-0043: Task Order No. 23, Subtask 2, Option Year Two.)
Appendix B
Background on the OSHA Injury and Illness Recordkeeping Audit Program
Program-Related Analyses and Key Findings
As an initial step in assessing the quality of information compiled by OSHA’s Data
Initiative (ODI) collection system, the Agency conducted two data validation studies in 1996:
•  An analysis of the data collection system’s edit criteria results and commentary on data quality from State agencies assisting in the collection effort.

•  Calculation of descriptive statistics on the collected data and comparison of the data with injury and illness data from the BLS Annual Survey.
Findings from the studies indicated that OSHA had implemented a credible system to provide the
Agency with useful, establishment-specific data on occupational injuries and acute illnesses.
At the same time, the studies underscored the need for OSHA to continue efforts to
ensure the quality of the OSHA-collected data. Under the audit program, OSHA conducts onsite
audits of employer injury and illness records to verify the overall accuracy of source records,
estimate the extent of employer compliance with the OSHA recordkeeping requirements defined
in 29 CFR 1904, and assess the consistency between data on the employer’s Log and data
submitted to the Agency under the ODI.
In 1997, OSHA conducted an audit pilot program in nine establishments to test the
Agency’s protocol designed for efficient use of resources in performing recordkeeping audits.
The protocol is designed to save auditors time through the review of records for a statistical
sampling of employees within an establishment and through use of the OSHA Recordkeeping
Audit Assistant (ORAA) software system for streamlining the process of conducting,
documenting, tracking, and analyzing the establishment audit.
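As an illustration of the employee-sampling step that the protocol uses to save auditor time, the sketch below draws a simple random sample from an establishment roster. The flat sample-size parameter and the roster names are placeholders; the protocol’s actual sample-size rules are set out in the audit instruction and are not reproduced here.

    import random

    def sample_employees(roster, sample_size, seed=0):
        """Draw a simple random sample of employees whose records the auditor reviews."""
        rng = random.Random(seed)
        return rng.sample(roster, min(sample_size, len(roster)))

    # Hypothetical 120-person roster; the auditor reviews records for 30 employees.
    roster = [f"employee_{i:03d}" for i in range(1, 121)]
    selected = sample_employees(roster, 30)
    print(len(selected), selected[:5])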
Overall, OSHA’s analysis of the pilot test, which reviewed calendar year (CY) 1995
records, demonstrated the feasibility of the protocol for use in a larger audit program. In 1998,
based on its experience with the pilot test, the Agency modified the protocol slightly for use in
the first full-scale program for auditing employer injury and illness records. (That first year
involved audits on CY 1996 records.) Similarly, for the next five years of the audit program,
OSHA drew upon its earlier experience and made minor adjustments in implementation of the
program for audits on establishments’ CY 1997, 1998, 1999, 2000, and 2001 records,
respectively.
In summary, OSHA’s analyses of the first six years of the audit program found the
following:
•  The sample of establishments audited was representative of the sampling universe.

•  The audit protocol, including sampling of employees within establishments, appears to provide OSHA with a feasible process to monitor the quality of employer injury and illness recordkeeping.

•  The estimates of overall accuracy for total recordable and lost workday cases (i.e., establishments at-or-above the 95 percent threshold) suggest that the ODI collection currently provides reasonably accurate data that OSHA can use to help meet its program and performance measurement data needs. Related findings include:

   o  The percent of establishments with injury/illness recordkeeping determined to be at-or-above the threshold of accuracy has increased.

   o  Errors are not widely distributed across the sample establishments. A small number of establishments account for most of the underrecorded cases.

   o  Both overrecording and underrecording are observed.

   o  Differences found in comparing the audit data with the data submitted to OSHA result in very few changes of the targeting category status of establishments for inspections.

   o  There is no evidence that small establishments have less accurate injury/illness records than medium or large size establishments.
The sixth year of the audit program marked the last analysis of injury/illness
recordkeeping under the old version of 29 CFR 1904. Subsequent annual audit program cycles
focus on records maintained by employers under the revised rule, which went into effect on
January 1, 2002. The intention of the revisions made to the recordkeeping requirements is to
simplify injury/illness recordkeeping for employers and contribute to the quality of establishment
injury/illness data.
The seventh year of the audit program focused on CY 2002 injury/illness recordkeeping
and provided a preliminary review of accuracy in non-construction establishments under the first
year of the revised recordkeeping rule. The annual analysis indicated that recordkeeping
accuracy was not significantly different than the results found in past years under the old rule.
Highlights of Annual Recordkeeping Audits and Analyses over the First Seven Years
Second Year of Program (Audits on CY 1997 Recordkeeping). Notable differences in
implementation of the second-year audit program included expanding the audit universe beyond
the Federal OSHA jurisdiction to include establishments in State Plan States. Also, before
selecting a sample of audit establishments, OSHA implemented implicit stratification of the
universe by first sorting establishments on Standard Industrial Classification (SIC) code,
followed by OSHA Region, and last by employment size. This approach is designed to provide
sample establishments in similar proportions to their SIC, geographic, and size distribution in the
universe. Compared to a simple random sampling approach, implicit stratification better distributes the audit workload among the OSHA Regions and balances the industry (manufacturing vs. non-manufacturing/non-construction) and establishment size distributions for the analysis.
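A minimal sketch of this implicit stratification approach appears below: the frame is sorted on SIC code, then OSHA Region, then employment size, and an equal-interval (systematic) selection is taken from the sorted list. The systematic-selection step and the field names are assumptions for illustration, since this passage describes only the sort order applied before sampling.

    def implicit_stratified_sample(frame, n, start=0):
        """Sort the frame on SIC, Region, then employment size, and take an
        equal-interval (systematic) selection of n establishments."""
        ordered = sorted(frame, key=lambda e: (e["sic"], e["region"], e["employment"]))
        step = len(ordered) / n
        return [ordered[int(start + i * step)] for i in range(n)]

    # Hypothetical miniature frame of establishments.
    frame = [
        {"id": 1, "sic": "20", "region": 1, "employment": 55},
        {"id": 2, "sic": "20", "region": 5, "employment": 140},
        {"id": 3, "sic": "34", "region": 3, "employment": 75},
        {"id": 4, "sic": "34", "region": 9, "employment": 300},
        {"id": 5, "sic": "80", "region": 2, "employment": 60},
        {"id": 6, "sic": "80", "region": 4, "employment": 220},
    ]
    print([e["id"] for e in implicit_stratified_sample(frame, 3)])  # [1, 3, 5]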
Third Year of Program (Audits on CY 1998 Recordkeeping). In the third year of the
audit program, OSHA began to explore the use of a standard sampling universe to facilitate
comparison of year-to-year estimates. OSHA also increased the number of establishments in the
audit sample and the number of assigned audits in order to increase the likelihood that the
number of audits available for analysis would be closer to the approximate target of 250.
Additionally, the third-year audit program’s coverage was expanded by including establishments
with an average employment between 40 and 49 (compared to the previous cut-off at 50 in 1997
and 60 in 1995 and 1996) and by encouraging a greater number of State Plan States to
participate.
Fourth Year of Program (Audits on CY 1999 Recordkeeping). For the fourth-year audit
program, OSHA modified its approach for selecting audit establishments from a universe of
establishments participating in the ODI in a specific year. Instead, OSHA selected a sample from
a standard ODI universe that covered all years of the ODI. OSHA’s objective in sampling from a
standard ODI universe was to establish a credible basis for generalizing the estimate of overall
accuracy for an individual year’s employer injury and illness recordkeeping to ODI
establishments nationwide. Additionally, use of a standard universe would anticipate the benefit
of conducting year-to-year comparisons to assess recordkeeping under the new rule.
In the first four years of the program, the analysis found that about 90 percent of
establishments in the sampling universe for the specific year were estimated as having accurately
recorded the number of total recordable cases; about 88 percent of establishments were found to
be accurate in recording lost workday cases.
Fifth Year of Program (Audits on CY 2000 Recordkeeping). For the fifth-year audit
program, OSHA selected a sample for a second time from a standard ODI universe that covered
all years of the ODI. This enabled OSHA to include a preliminary comparison of recordkeeping
accuracy estimates for ODI establishments nationwide. The comparison indicated consistency in
recordkeeping accuracy estimates across the fourth and fifth years of the audit program.
Interpretation of the comparison was limited somewhat because OSHA had further refined the
definition of the standard ODI universe in the fifth year by excluding two industries that are only
selectively included in the ODI. Nonetheless, the preliminary comparison provided potential
baseline data for using such comparisons to assess recordkeeping under the new rule.
Sixth Year of Program (Audits on CY 2001 Recordkeeping). For the sixth-year audit
program, OSHA again selected a sample from a standard ODI universe that covered all years of
the ODI. The analysis found consistency between CY 2000 and CY 2001 recordkeeping
accuracy estimates for ODI establishments nationwide. Further, in applying a statistical test to
the comparison of accuracy estimates, OSHA found no significant difference in the means for
the two years, suggesting overall recordkeeping improvement. (An additional, minor refinement
to the standard ODI universe should be noted regarding this year-to-year comparison; i.e., for the
sixth year program, the Agency included SIC 43 (U.S. Postal Service)—now under OSHA
jurisdiction—in the universe, which added 297 facilities.) In the fifth and sixth years of the
program, the analysis found that about 95 percent of establishments in the sampling universe
were estimated as having accurately recorded the number of total recordable cases; about 93
percent of establishments were found to be accurate in recording lost workday cases.
Also for the sixth year’s audit program, OSHA conducted a pilot test of audits at a
sample of construction firms using a protocol that addressed issues specific to the construction
industry and its operation of “short-term establishments.” OSHA selected a sample of
construction audit establishments from a universe of about 9,000 establishments that had
submitted complete data for the CY 2001 ODI collection and met relevant criteria (e.g., operate
under one of the three 2-digit construction SIC codes).
In analyzing the results of pilot audits on establishments in construction industries,
OSHA implemented the same general approach used for audits at non-construction
establishments. Overall, the analysis found a slightly lower percent of construction
establishments at-or-above the threshold of accuracy for both total recordable and lost workday
cases in comparison to the accuracy estimates for non-construction establishments. While the
construction pilot findings indicate that the audit methodology developed for non-construction
establishments can be implemented in construction establishments, unique aspects of construction
operations require allowances for flexibility in maintaining records, which yield a mix of
recordkeeping audits that vary in terms of establishment scope. Because of fundamental
differences in the recordkeeping procedures between the construction and non-construction
industries, if OSHA continues collecting data from construction SICs, it is recommended that the
ODI construction universe and audit analysis remain separate from the non-construction analysis.
Seventh Year of Program (Audits on CY 2002 Recordkeeping). For the seventh-year
audit program, OSHA again selected a sample from a standard ODI universe. This analysis on
CY 2002 recordkeeping provided a preliminary review of injury/illness recordkeeping accuracy
in non-construction establishments under the first year of employer implementation of OSHA’s
revised recordkeeping rule. The study indicated that recordkeeping accuracy in the first year
under the revised recordkeeping rule is not significantly different than the results found in past
years under the old rule. (Note that for calendar years before 2002, “accuracy” refers to
recordable cases recorded on the Log 200 or lost workday cases recorded on the Log as lost
workday cases. As of CY 2002, with implementation of the revised recordkeeping rule,
“accuracy” refers to recordable cases recorded on the Log 300 or DART cases recorded on the
Log as DART cases.)
Also in the seventh year of the program, OSHA established a multi-year analysis cycle
for audits on employer recordkeeping that includes interim and comprehensive analyses. OSHA
is no longer required to report annually on its monitoring of ODI data quality to OMB. Although
OSHA will continue to conduct annual recordkeeping audits, it will now report every third year
to OMB in conjunction with the Agency’s request for clearance to continue the annual ODI data
collection. For the non-reporting year(s) of a multi-year reporting cycle, OSHA will conduct only a summary
analysis of the annual audit program. The analysis on CY 2002 records addressed an interim year
of the OMB reporting cycle.
Appendix C
OSHA Instruction: Audit and Verification Program of
Occupational Injury and Illness Records
Directive Number: CPL_02-00-138
Appendix D
Tracking Status Codes Used in
Processing CY 2006 ODI Submissions
Distribution and Collection Status Codes

BLANK   Establishment record (address information only) in the database
ML      Mailed form
CI      Checked in form returned from establishment
ES      Electronically submitted data by establishment
NRM     Nonresponse form mailed
NRC     Nonresponse telephone call made
OTM     Optional third mailing of form
PO      Post office return
PRM     Remailed form to corrected address

Processing Status Codes

DE1     Primary data entry
COMP    Secondary data entry and data compared
ECRG    Edit condition report generated

Final Status Codes

OK      Data are complete and accurate
FD      Final data for business that has ceased operations
UNR     State determined information is unreliable
NC      Noncompliant establishment
DU      Duplicate form
OB      Out of business
OS      Out of scope
OO      Only office/sales staff at establishment
OKOS    Data are complete and accurate but out of scope
OKPD    Data are complete and accurate—partial year data
PHD     Phone disconnected
RU      Records unavailable
UM      Unmailable, no new address found