APPENDIX A
METHODOLOGY FOR THE 2009 GENERAL AVIATION
AND PART 135 ACTIVITY SURVEY
Purpose of Survey
The General Aviation and Part 135 Activity (GA) Survey provides the Federal Aviation
Administration (FAA) with information on general aviation and on-demand Part 135 aircraft
activity. The survey enables the FAA to monitor the general aviation fleet so that it can
anticipate and meet demand for National Airspace System (NAS) facilities and services, assess
the impact of regulatory changes on the fleet, and implement measures to assure the safe
operation of all aircraft in the NAS. The data are also used by other government agencies, the
general aviation industry, trade associations, and private businesses to identify safety problems
and to form the basis for research and analysis of general aviation issues.
Background and History
Prior to the first implementation of the annual GA Survey in 1978, the FAA used the Aircraft
Registration Eligibility, Identification, and Activity Report (AC Form 8050-73) to collect data on
general aviation activity. The form was sent annually to all owners of civil aircraft in the United
States and served two purposes: a) Part 1 was the mandatory aircraft registration revalidation
form; and b) Part 2 was voluntary and applied to general aviation aircraft only, asking questions
on the owner-discretionary characteristics of the aircraft such as flight hours, avionics
equipment, base location, and use. The FAA used this information to estimate aircraft activity.
In 1978, the FAA replaced AC Form 8050-73 with a new system. Part 1 was replaced by a
triennial registration program. In January 1978, the FAA implemented a new procedure, known
as triennial revalidation, for maintaining its master file. Instead of requiring all aircraft owners to
revalidate and update their aircraft registration annually, the FAA only required revalidation for
those aircraft owners who had not contacted the FAA Registry for three years. This less
frequent updating affected the accuracy and representation in the master file: a) the accuracy of
information about current owners and their addresses deteriorated; and, b) the master file
retained information on aircraft that would have been re-registered or purged from the file under
the previous revalidation system.
Part 2 of AC Form 8050-73 was replaced by the General Aviation Activity Survey. Conducted
annually, the survey was based on a statistically selected sample of aircraft, and it requested
the same type of information as Part 2 of AC Form 8050-73. The first survey took place in 1978
and collected data on the 1977 general aviation fleet.
In 1993, the name of the survey was changed to the General Aviation and Air Taxi Activity
Survey to reflect that the survey included air taxi (that is, on-demand Part 135) aircraft. Starting
in 1999, information about avionics equipment, which had been collected only every other year,
was requested every year. As a result, the survey’s name was changed to the General Aviation
and Air Taxi Activity and Avionics Survey. In 2006, “Part 135” replaced the term “Air Taxi” in the
survey title, the word “Avionics” was removed (though avionics data were still collected
annually), and the survey was named the General Aviation and Part 135 Activity Survey. This is
the name under which the 2009 survey was conducted. The 2009 statistics in this report were
derived from the thirty-second GA Survey, which was implemented in 2010.
The GA Survey has undergone periodic revisions to content, implementation, and definition of
the GA population in order to remain current with regulations, activity patterns, and aviation
technology. Tables A.1 through A.3 summarize changes in the form or content of the survey, the
protocol for collecting data, and the sample design, including changes in how the survey
population is defined.
Table A.1: Changes in Form or Content of Survey Questionnaire, by Survey Year
Year
Change in form or content of survey questionnaire
1993
Added sightseeing and external load to use categories
1996
Added public use (i.e., flights for the purpose of fulfilling a government function) to use
categories
Significant re-design of the entire survey form to reduce item non-response, add new content,
and be compatible with optical scanning
1999
Added air medical services to use categories
Discontinued the use of a catch-all “other” category as used in previous years
Began collecting avionics data every year, rather than every other year
2000
“Public use” asked as a separate question, independent of other use categories (e.g., business
transportation), because it was not mutually exclusive with respect to other flight activity
2002
Use categories refined to be mutually exclusive and exhaustive and match definitions used by
National Transportation Safety Board (NTSB) for accident reporting
2004
Air medical services was divided into two separate types to capture air medical flights under Part
135 and air medical flights not covered by Part 135
A more clearly defined “other” category was reintroduced
Fractional ownership question was changed from yes/no to a percentage
2005
Reduced the number of fuel type response categories by removing obsolete options
Average fuel consumption (in gallons per hour) was added
Revised questions about avionics equipment by adding and rearranging items
Location of aircraft revised to ask the state or territory in which the aircraft was "primarily flown" during the survey year rather than where it was "based" as of December 31 of the survey year
Percentage of hours flown in Alaska was added
2007
Questions on percentage of hours flown under different flight plans, flight conditions, and
day/night were revised into a single tabular format
Number of types of landing gear systems was expanded
Ice protection equipment was revised and prohibition from flight in icing conditions was added
Questions about avionics equipment were revised to reflect changes in technology
2009
Two questions about avionics equipment were revised:
"Air Bag/Ballistic Parachute" was asked as two items: "Air Bag" and "Ballistic Parachute"
"ADS-B (Mode S)" was separated into two questions: "ADS-B (Mode S) Transmit Only (Out)" and "ADS-B (Mode S) Transmit and Receive (In)"
Table A.2: Changes in Data Collection Methodology, by Survey Year
Year
Change in data collection methodology
1999
Non-respondent telephone survey conducted to adjust active aircraft and hours flown estimates
2000
Discontinued non-respondent telephone survey because of the variability of telephone non-respondent factors¹
Added Internet response option
2003
Added a reminder/thank-you postcard between the first and second mailings
2004
Introduced "large fleet" summary form to allow owners/operators of multiple aircraft to report aggregate data for their entire fleet on a single form
Initiated telephone follow-up effort to contact owners/operators of multiple aircraft who had not responded. (Protocol encourages and facilitates participation by providing alternate forms and offering technical assistance but survey is not conducted by telephone.)
2009
Initiated telephone follow-up effort to contact owners/operators of individual aircraft who completed a partial survey. (Protocol encourages and facilitates participation by offering technical assistance but survey is not conducted by telephone.)
Table A.3: Changes in Sample Design or Definition of Survey Population, by Survey Year
Year
Change in sample design or survey population
1993
Number of aircraft types classified by the sample was expanded from 13 to 19
1999
Sample design revised to stratify by aircraft type (19 categories) and FAA region (9 categories)²
2003
Aircraft with known incorrect addresses and identified as "Postmaster Return" status on the Registry were retained in the definition of the survey population and were eligible for selection into the survey sample
Aircraft reported as "registration pending" or "sold" (if sold status less than 5 years ago) on the Registry were retained in the definition of the survey population and were eligible for selection into the survey sample
2004
Sample design revised to stratify by aircraft type (19 categories), FAA region (9 categories), and
whether the aircraft is owned by an entity certified to fly Part 135 (2 categories)
Introduced 100 percent sample of the following groups: turbine aircraft, rotorcraft, on-demand Part 135 aircraft, and Alaska-based aircraft
2005
Sample design and reporting revised by introducing light-sport aircraft as a 20th aircraft type sampled at 100 percent. For purposes of sampling and reporting, "light-sport" included aircraft with Special or Experimental airworthiness certification as well as light-sport aircraft for which airworthiness certificates are not yet final.
2006
Sample design simplified by reducing the number of aircraft types to 14 (removed distinctions based on number of seats and eliminated "Other" subcategories of piston, turboprop, and turbojet aircraft)³
Sample design included 100 percent sample of aircraft manufactured in the past five years
¹ Telephone surveys of non-respondents also were conducted in 1977, 1978, 1979, 1997, and 1998. Please refer to the 1999 GA Survey report for a full discussion of the telephone survey of non-respondents.
² Before 1999, the sample was stratified by aircraft type (19 categories) and state/territory (54 categories).
³ Published estimates continue to distinguish aircraft categories by engine type, number of engines, and number of seats.
2008
The 100 percent sample of light-sport aircraft was limited to light-sport aircraft with Special airworthiness certification. All other light-sport aircraft (those with experimental airworthiness and those with airworthiness certificates that are not final) were sampled at a rate of less than 1.0 but in sufficient numbers to support statistical estimation.
Survey Population and Survey Sample
The survey population for the 2009 General Aviation and Part 135 Activity Survey includes all
civil aircraft registered with the FAA that are based in the US or US territories and that were in
existence and potentially active between January 1 and December 31, 2009. This includes
aircraft operating under:
• Part 91: General operating and flight rules
• Part 125: Certification and operations: airplanes having a seating capacity of 20 or more passengers or a maximum payload capacity of 6,000 pounds or more (but not for hire)
• Part 133: Rotorcraft external load operations
• Part 135: On-demand (air taxi) and commuter operations not covered by Part 121
• Part 137: Agricultural aircraft operations.
Aircraft operating under Part 121 as defined in Part 119 are excluded from the survey
population. Foreign air carriers, which operate under Part 129, are also not part of the survey
population. Civil aircraft known not to have been potentially active during the survey year are also excluded from the population (e.g., aircraft displayed in museums, aircraft destroyed prior to January 1, 2009).
The Aircraft Registration Master File, maintained by the FAA’s Mike Monroney Aeronautical
Center in Oklahoma City, Oklahoma, serves as the sample frame or list of cases from which a
sample of civil aircraft is selected. The Registration Master File (“Registry”) is the official record
of registered civil aircraft in the United States. For the purpose of defining the 2009 survey
population, we used the Registry’s list of aircraft as of December 31, 2009.
The Registry, like many sample frames, is an imperfect representation of the survey population.
While it may exclude a small number of aircraft that operate under the FAA regulations
governing the operation of general aviation and on-demand Part 135 aircraft, it also includes
aircraft that are not part of the survey population. Prior to sample selection, several steps are
taken to remove ineligible aircraft from the sample frame. Specifically, this includes removing
the following:
• aircraft whose registration has been cancelled or revoked
• aircraft based in Europe or registered to a foreign company that have not returned flight hour reports
• aircraft that operate under Part 121
• aircraft destroyed or moved to museums prior to January 1, 2009
• aircraft reported sold before 2004 (5 years prior to the survey year)⁴
• aircraft flagged Postmaster Return (known to have incorrect address information) since before 1999 (10 years prior to the survey year)
• aircraft that lack information necessary to execute the sample design (i.e., aircraft type, FAA region)⁵
The Registry included 374,373 aircraft as of December 31, 2009. This represents a decrease of less than 1 percent (0.47 percent) from the 2008 Registry file (376,124 records). After excluding the aircraft described above, 309,811 records remain, which is 82.8 percent of the Registry as of December 31, 2009. The 2009 survey population of 309,811 represents a decrease of less than 1 percent (0.6 percent) from 2008 (311,531). The 2009 survey population as a percentage of all records on the Registry master file is essentially unchanged from the previous year (82.8 percent in both 2008 and 2009).
The 2009 GA Survey Sample
The 2009 survey sample design is the same as that for the 2008 survey year.⁶ The sample is
stratified by aircraft type (15 categories), FAA region in which the aircraft is registered (9
categories), whether the aircraft operates under a Part 135 certificate (2 categories), and
whether the aircraft was manufactured in the past 5 years (2 categories). Aircraft operated
under a Part 135 certificate were identified using the FAA’s Operations Specifications
Subsystem (OPSS) database that was merged with the Registry by N-number. The four
stratifying variables yield a matrix of 540 cells.
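The four-way stratification above can be sketched as a cross-classification; the category labels below are placeholders, and only the category counts (15 × 9 × 2 × 2 = 540) come from the text:

```python
from itertools import product

# Placeholder labels; only the counts come from the survey design.
aircraft_types = [f"type_{i}" for i in range(1, 16)]    # 15 aircraft types
faa_regions    = [f"region_{i}" for i in range(1, 10)]  # 9 FAA regions
part_135       = [True, False]                          # Part 135 certificate held?
new_aircraft   = [True, False]                          # manufactured in past 5 years?

# Each combination of the four stratifying variables is one design cell.
cells = list(product(aircraft_types, faa_regions, part_135, new_aircraft))
print(len(cells))  # 540
```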
We define 15 aircraft types to execute the sample design as shown in Table A.4. The
classification distinguishes among fixed wing aircraft, rotorcraft, experimental aircraft, light-sport,
and other aircraft. Within the major categories of fixed wing and rotorcraft, we differentiate
aircraft by type and number of engines (e.g., piston, turboprop, turbojet, turbine, single- and two-engine). Experimental aircraft are subdivided by amateur-built status and airworthiness
certification, and we classify “other” aircraft as gliders or lighter-than-air. Light-sport is
subdivided into special and experimental based on airworthiness certification. Light-sport aircraft
for which airworthiness certificates are not yet final are included with experimental light-sport.
Prior to the 2006 survey year, we defined 20 aircraft types and distinguished aircraft by size
(number of seats) as well as by type and number of engines and airworthiness. We eliminated
subcategories based on number of seats to increase the efficiency of the sample. We also
eliminated three “other” categories. Improvements in the Registry over the years have left
relatively few aircraft assigned to three residual categories: Fixed wing piston–other, fixed wing
turboprop–other, and fixed wing turbojet–other. Because these categories are relatively small
and unable to support reliable statistical estimates, the aircraft are reassigned to the modal
category in the corresponding larger group. Since 2006, the modal categories have resulted in the following reassignments: a) from fixed wing piston–other to fixed wing piston–1 engine, 4 or more seats; b) from fixed wing turboprop–other to fixed wing turboprop–2 engines, 1-12 seats; and c) from fixed wing turbojet–other to fixed wing turbojet.
⁴ Prior to 2004, aircraft were excluded if reported sold more than one year prior to the survey year.
⁵ The number of aircraft missing this information is typically very small.
⁶ The 2006 survey year initiated changes in the sample design that are retained in 2009. For a discussion, see "Appendix A: Methodology for the 2006 General Aviation and Part 135 Activity Survey."
Table A.4 also identifies the aircraft types that are used in reporting survey results. Although we
define 15 aircraft types for the purpose of sampling, statistical estimates are reported for 18
aircraft types that elaborate aggregate groups (e.g., fixed wing piston) by number of engines
and number of seats (e.g., fixed wing piston–2 engines, 1-6 seats). Starting with the 2009 GA
Survey, estimates are reported separately for experimental light-sport and special light-sport
aircraft. Prior to 2009, there were too few light-sport aircraft to support reliable estimates by
experimental and special airworthiness within the light-sport group.
Table A.4: Aircraft Types Used for Sample Design and for Reporting Survey Results
Aircraft Types in the Sample Design     Aircraft Types for Reporting Results
Fixed wing piston (1 engine)            Fixed wing piston (1 engine, 1-3 seats)
Fixed wing piston (2 engines)           Fixed wing piston (1 engine, 4 or more seats)
Fixed wing turboprop (1 engine)         Fixed wing piston (2 engines, 1-6 seats)
Fixed wing turboprop (2 engines)        Fixed wing piston (2 engines, 7 or more seats)
Fixed wing turbojet                     Fixed wing turboprop (1 engine)
Rotorcraft (Piston)                     Fixed wing turboprop (2 engines, 1-12 seats)
Rotorcraft (Turbine, 1 engine)          Fixed wing turboprop (2 engines, 13 or more seats)
Rotorcraft (Turbine, multi-engine)      Fixed wing turbojet
Glider                                  Rotorcraft (Piston)
Lighter-than-air                        Rotorcraft (Turbine, 1 engine)
Experimental (Amateur)                  Rotorcraft (Turbine, multi-engine)
Experimental (Exhibition)               Glider
Experimental (Other)                    Lighter-than-air
Light-sport (Experimental)              Experimental (Amateur)
Light-sport (Special)                   Experimental (Exhibition)
                                        Experimental (Other)
                                        Light-sport (Experimental)
                                        Light-sport (Special)
Aircraft Sampled at 100 Percent
The 2009 survey sample included several types of aircraft that were sampled at a rate of 1.0.
Because of the FAA’s interest in better understanding the operation of these aircraft, all such
aircraft listed in the Registry were included in the survey sample to ensure a sufficient number of
responses to support analysis and provide more precise estimates of fleet size and aircraft
activity. These include:
• 100 percent sample of turbine aircraft (turboprops and turbojets)
• 100 percent sample of rotorcraft
• 100 percent sample of special light-sport aircraft
• 100 percent sample of aircraft operating on-demand Part 135
• 100 percent sample of aircraft based in Alaska⁷
• 100 percent sample of aircraft manufactured within the past 5 years (since 2004).
Since 2004, the survey design has included 100 percent samples of turbine aircraft, rotorcraft,
aircraft certificated to operate under Part 135, and Alaska-based aircraft. In 2005, we added the
100 percent sample of light-sport aircraft. In 2006, we added the 100 percent sample of
recently-manufactured aircraft. In 2008, we revised the 100 percent sample of light-sport aircraft
to include only special light-sport aircraft. Experimental light-sport and those without final
airworthiness documentation are sampled at less than 100 percent but in sufficient numbers to
support statistical estimates of flight activity. Altogether the aircraft sampled at 100 percent
contributed 62,983 observations to the 2009 survey sample.
Aircraft Sampled at Less than 100 Percent
Aircraft that are not part of a 100 percent sample are subject to selection based on sampling
fractions defined for each cell in the sample design matrix. “Average annual flight hours” is the
primary measure needed by the FAA to address survey goals. Sample fractions for each
stratum are defined to optimize sample size to obtain a desired level of precision for an estimate of flight activity. Data from the previous survey year on average hours flown, variability in hours flown by region and aircraft type, and response rates are used to set precision levels and identify the optimal sample size for each stratum. Aircraft are randomly selected from each cell in the matrix, subject to the desired sample size. Strata that yield a very small sample size are examined and adjusted to include all observations in the stratum if necessary.
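The appendix does not give the exact allocation formula. As an illustration only (not the FAA's actual procedure), a Neyman-style allocation captures the idea of optimizing precision for flight-hour estimates: strata that are large or have highly variable flight hours receive more sample. The stratum sizes and standard deviations below are invented:

```python
# Hypothetical strata: population size N and standard deviation of annual
# flight hours. These numbers are made up for illustration.
strata = {
    "fw_piston_region1": {"N": 5000, "sd": 120.0},
    "fw_piston_region2": {"N": 3000, "sd": 150.0},
    "experimental":      {"N": 800,  "sd": 90.0},
}
total_n = 600  # desired total sample size across these strata

# Neyman rule: allocate in proportion to N_h * S_h.
nh_sh = {name: s["N"] * s["sd"] for name, s in strata.items()}
total = sum(nh_sh.values())
allocation = {
    # never allocate more than the stratum actually contains
    name: min(strata[name]["N"], round(total_n * w / total))
    for name, w in nh_sh.items()
}
print(allocation)
```

Note that the small "experimental" stratum still receives a comparatively high sampling fraction, mirroring the observation later in this section that small strata may need nearly all available aircraft.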
In 2009, an additional 22,103 aircraft were sampled at a rate of less than 1.0. The number of
aircraft sampled at a rate of less than 1.0 is 22 percent greater than the 2008 survey when
18,086 such aircraft were included in the sample. Limiting the 100 percent sample of light-sport
aircraft to those with special airworthiness certificates made it possible to sample other
categories of aircraft at slightly higher rates without exceeding a maximum sample size feasible
within the constraints of the data collection protocol, timeframe, and resources. However, the number of aircraft selected at a rate of less than 1.0 is still lower than it was before the introduction of multiple 100 percent samples. In 2006, an additional 25,494 aircraft were sampled at a rate of less than 1.0, and in the 2005 survey year, an additional 34,667 aircraft were sampled. The increase in the 100 percent samples has had the greatest impact on fixed wing piston aircraft, where sampling fractions are small relative to population size. Other categories, such as "other aircraft" and experimental aircraft, may have a smaller absolute number of aircraft selected into the sample, but the sampling fractions are relatively high because almost all available aircraft are needed to populate the sample design.
⁷ Alaska-based aircraft are identified by the state listed in the Registry file, not survey data on where the aircraft is operated.
The 2009 GA Survey sample included 85,086 aircraft. Table A.5 summarizes the population
counts and sample sizes by aircraft type.
Table A.5: Population and Survey Sample Counts by Aircraft Type

Aircraft Type                   Population   Sample Size   Sample as Percent of Population
Fixed Wing - Piston                211,932        28,962        13.7
  1 engine, 1-3 seats               62,067         6,070         9.8
  1 engine, 4+ seats               127,780        14,970        11.7
  2 engines, 1-6 seats              14,953         5,060        33.8
  2 engines, 7+ seats                7,132         2,862        40.1
Fixed Wing - Turboprop               9,965         9,965       100.0
  1 engine                           4,293         4,293       100.0
  2 engines, 1-12 seats              4,653         4,653       100.0
  2 engines, 13+ seats               1,019         1,019       100.0
Fixed Wing - Turbojet               12,586        12,586       100.0
  2 engines                         12,586        12,586       100.0
Rotorcraft                          12,723        12,723       100.0
  Piston                             5,246         5,246       100.0
  Turbine (1 engine)                 5,819         5,819       100.0
  Turbine (multi-engine)             1,658         1,658       100.0
Other Aircraft                       9,802         5,105        52.1
  Glider                             3,121         1,922        61.6
  Lighter-than-air                   6,681         3,183        47.6
Experimental                        42,666        10,723        25.1
  Amateur                           37,279         6,612        17.7
  Exhibition                         3,140         1,978        63.0
  Other                              2,247         2,133        94.9
Light-sport                         10,137         5,022        49.5
  Experimental light-sport*          8,570         3,455        40.3
  Special light-sport                1,567         1,567       100.0
Total                              309,811        85,086        27.5
*Experimental light-sport includes light-sport aircraft with experimental airworthiness certificates
as well as light-sport aircraft for which airworthiness certification is not final.
Weighting the Survey Data
Data from completed surveys are weighted to reflect population characteristics. The weights reflect the proportion of aircraft sampled from the population in each sample stratum and differential response, as well as a small adjustment for aircraft that are not part of the survey population.
Initially, each aircraft for which we receive a completed survey is given a weight that reflects
sampling fraction and differential response. That is:
WEIGHT = (Population N_ijkl / Sample N_ijkl) * (Sample N_ijkl / N Respondents_ijkl)
where i, j, k, and l represent the four sample strata of aircraft type, FAA region, Part 135 status,
and whether an aircraft was manufactured in the past 5 years.
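A minimal numeric sketch of this initial weight for a single design cell, reading it as the inverse sampling fraction times an inverse-response adjustment (the counts below are hypothetical):

```python
# Hypothetical counts for one cell (i, j, k, l) of the sample design.
population_n = 1200  # aircraft in the population for this cell
sample_n     = 300   # aircraft selected into the sample
respondents  = 200   # completed surveys received

# Inverse sampling fraction times inverse response rate; the sample_n
# terms cancel, so each respondent represents population_n / respondents
# aircraft in the population.
weight = (population_n / sample_n) * (sample_n / respondents)
print(weight)  # 6.0
```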
The weight is subsequently adjusted to reflect new information about non-general aviation
aircraft. That is, survey responses that identify an aircraft as not being part of the survey
population—destroyed prior to January 1, 2009; displayed in a museum; operated primarily as
an air carrier under Part 121 or 129; or a military aircraft—are used to remove aircraft
proportionally from the sample and from the population. This adjustment is done at the level of
the 15 aircraft types. The procedure assumes that non-GA aircraft occur in the same proportion
among survey respondents and non-respondents. To the extent that non-GA aircraft are less
likely to receive and complete a survey, this approach will underestimate the adjustment for
aircraft that are not part of the general aviation population.
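The proportional-removal step above can be sketched for one aircraft type; the counts are invented, and the key assumption (stated in the text) is that the non-GA rate among respondents holds for the whole population:

```python
# Hypothetical counts for one of the 15 aircraft types.
respondents      = 500   # completed surveys for this aircraft type
non_ga_responses = 10    # responses revealing out-of-scope aircraft
                         # (museum display, Part 121/129, military, destroyed)
population       = 6000  # population count for this aircraft type

# Assume the respondent non-GA rate applies population-wide.
non_ga_rate = non_ga_responses / respondents
adjusted_population = population * (1 - non_ga_rate)
print(adjusted_population)  # 5880.0
```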
Errors in Survey Data
Errors associated with survey data can be classified into two types: sampling and non-sampling errors. Sampling errors occur because the estimates are based on a sample of aircraft rather than the entire population, and we can expect, by chance alone, that some aircraft selected into the sample differ from aircraft that were not selected.
Non-sampling errors can be further subdivided into a) errors that arise from difficulties in the
execution of the sample (e.g., failing to obtain completed interviews with all sample units), and
b) errors caused by other factors, such as misinterpretation of questions, inability or
unwillingness to provide accurate answers, or mistakes in recording or coding data.
Sampling Error
The true sampling error is never known, but in a designed survey we can estimate the potential
magnitude of error due to sampling. This estimate is the standard error. The standard error
measures the variation that would occur among the estimates from all possible samples of the
same design from the same population.
This publication reports a standard error for each estimate based on survey sample data. An
estimate and its standard error can be used to construct an interval estimate (“confidence
interval”) with a prescribed level of confidence that the interval contains the true population
figure. In general, as standard errors decrease in size we say the estimate has greater precision
(the confidence interval is narrower), while as standard errors increase in size the estimate is
less precise (the confidence interval is wider). Table A.6 shows selected interval widths and
their corresponding confidence.
Table A.6: Confidence Interval Estimates

Width of interval     Approximate confidence that interval includes true population value
1 standard error      68%
2 standard errors     95%
3 standard errors     99%
This report presents a “percent standard error” for each estimate, which is the standard error
relative to the mean. The percent standard error is the ratio of the standard error to its estimate
multiplied by 100. For example, if the estimate is 4,376 and the standard error is 30.632, then
the percent standard error is (30.632/4,376) x 100 = 0.7. Reporting percent standard errors
makes it possible to compare the precision of estimates across categories.
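The worked example above can be reproduced directly:

```python
# Figures from the worked example in the text.
estimate       = 4376
standard_error = 30.632

percent_standard_error = standard_error / estimate * 100
print(round(percent_standard_error, 1))  # 0.7
```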
Estimates and percent standard errors reported in Table 2.1 in Chapter 2 ("Population Size,
Active Aircraft, Total Flight Hours, and Average Flight Hours by Aircraft Type") provide an
example of how to compute and interpret confidence intervals. To obtain a 95 percent
confidence interval for the estimated number of total hours flown for twin-engine turboprops in
2009, where the total hours flown is estimated to be 1,148,663 and the percent standard error of
the estimate is 2.1, the following computation applies:
Lower confidence limit: 1,148,663 – 1.96(2.1/100)(1,148,663) = 1,101,384
Upper confidence limit: 1,148,663 + 1.96(2.1/100)(1,148,663) = 1,195,942
In other words, if we drew repeated samples of the same design, approximately 95 percent of the intervals constructed this way would contain the true total hours flown by twin-engine turboprops.
Non-sampling Error
Sampling error is estimable and can be reduced through survey design (e.g., by increasing
sample size), but it is difficult, if not impossible, to quantify the amount of non-sampling error.
Although extensive efforts are undertaken to minimize non-sampling error, the success of these
measures cannot be quantified.
Steps taken to reduce non-sampling error include strategies to reduce non-response and efforts
to minimize measurement and coding errors. To this end, implementation and design of the
2009 GA Survey incorporated the following steps to maximize cooperation among sample
members:
• Two modes of administration to facilitate access to the survey: a postcard invitation to complete the survey on the Internet, followed by a mail survey to be completed by pen or pencil.
• Three mailings of the survey to individuals who had not yet responded, as well as a reminder/thank-you postcard.
• Cover letters accompanying each survey mailing that clearly explained the purpose of the survey and carried the endorsement (organizational logos) of several aviation associations.⁸
• Cover letters that assured owners of the confidentiality of their responses and informed them: "Names of individuals are never associated with responses. There is an identification number on your survey only so [survey contractor] knows who should receive the survey."
• Use of additional sources to obtain updated contact information and help ensure the mail survey reaches the sample member (e.g., National Change of Address, updates from aviation associations).
• Use of a toll-free telephone number and e-mail address to respond to questions.
• Collaboration with aviation organizations and industry groups to encourage cooperation of owners or operators of multiple aircraft.
• Telephone follow-up to owners/operators of multiple aircraft who had not yet responded, as well as telephone follow-up to owners/operators of single aircraft that started, but did not complete, the survey.
The survey efforts also minimize measurement error by increasing the likelihood that
respondents share a common understanding of survey questions and reducing errors in data
coding. These efforts include:
• Close collaboration with the FAA, other federal agencies, and aviation groups to refine and clarify question wording and definitions. The questionnaire is re-examined each year to identify ambiguities or revisions necessary to remain consistent with aviation regulations and definitions.
• Periodic reviews and re-designs of the questionnaire (see the "Background and History" section of this appendix). Significant revisions are thoroughly pre-tested with a sample of aircraft owners or operators and, if necessary, modified on the basis of the pre-test results.
• Comprehensive editing and verification procedures to ensure the accuracy of data transcription to machine-readable form as well as internal consistency of responses.
We undertake extensive effort to reduce measurement error, particularly where we can
anticipate systematic or repeated error on the part of survey respondents, but it is impossible to
eliminate all measurement error. Survey participants may misunderstand questions or misreport
[Footnote 8] The following associations’ logos appear on the 2009 cover letter and the introduction page of the Internet survey: Aircraft Owners and Pilots Association (AOPA), Experimental Aircraft Association (EAA), General Aviation Manufacturers Association (GAMA), Helicopter Association International (HAI), Light Aircraft Manufacturers Association (LAMA), National Agricultural Aviation Association (NAAA), National Air Transportation Association (NATA), National Business Aviation Association (NBAA), and Regional Air Cargo Carriers Association (RACCA). Surveys mailed to Alaska addresses included an insert in which the directors of three Alaska-based associations encouraged recipients to respond (Alaska Airmen’s Association, Alaska Air Carriers Association, and the Medallion Foundation).
A-12
Appendix A: Methodology for the 2009 General Aviation
and Part 135 Activity Survey
flight activity in ways that cannot be anticipated or prevented through survey or questionnaire
design. Where survey reports appear nonsensical or contradict FAA regulations (e.g.,
lighter-than-air aircraft providing air medical services), we manually verify that the data were
processed accurately. Instances in which a small number of illogical reports occur may be
suppressed and are indicated in table notes. No additional steps are taken to “cleanse” the data
of apparently illogical reports or assign them to other categories. To do so would introduce
additional and systematic error that would be misleading and would affect other uses of the data.
Imputation of Missing Data
Since the 2000 survey year, the survey questionnaire has undergone re-design efforts and data
collection methods have been developed to reduce item non-response. In 2009, less than five
percent of survey responses were missing data on the main reporting variables, and the rate
was less than two percent for sampled aircraft administered the full survey form. Imputation
rates are higher on some variables because the questions are not asked on the abbreviated
survey form that is used for owners/operators of multiple aircraft. Other variables with relatively
high rates of imputation are “Year of manufacture,” which is drawn from Registry files, and “state
primarily flown.” While “state primarily flown” is rarely missing, many answers cannot be coded
to a single state; for example, respondents list more than one state, identify a region of the
country, or simply indicate “US.”
Values are imputed for selected variables if the survey response is incomplete, the survey form
did not include the question, or the Registry data field is blank. For most variables, a
nearest-neighbor imputation procedure is used so that missing data are replaced with values
based on an aircraft with otherwise similar characteristics. Data are sorted by aircraft
characteristics, and starting values are selected randomly within that sorted sequence.
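The sort-then-replace procedure described above can be sketched as a sequential ("hot-deck") nearest-neighbor imputation. The sketch below is illustrative only: the field names, donor rule, and sample records are simplified assumptions, not the survey's production code.

```python
def hot_deck_impute(records, sort_keys, target):
    """Fill missing `target` values from the nearest preceding donor
    after sorting records so that similar aircraft are adjacent."""
    ordered = sorted(records, key=lambda r: tuple(r[k] for k in sort_keys))
    donor = None
    for rec in ordered:
        if rec[target] is not None:
            donor = rec[target]        # remember the most recent observed value
        elif donor is not None:
            rec[target] = donor        # impute from the nearest neighbor
    return ordered

# Hypothetical records: the middle aircraft is missing airframe hours
# and receives the value from its closest match in the sorted order.
fleet = [
    {"type": "piston-1eng", "age": 5,  "airframe_hours": 1200},
    {"type": "piston-1eng", "age": 7,  "airframe_hours": None},
    {"type": "piston-1eng", "age": 12, "airframe_hours": 2400},
]
imputed = hot_deck_impute(fleet, ["type", "age"], "airframe_hours")
```

In the actual procedure the sort keys correspond to the characteristics listed in Table A.7 (e.g., aircraft type, engine manufacturer/model, age), and the random starting point within the sorted sequence prevents the same donor from being reused systematically.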
Table A.7 lists the variables for which values are imputed, describes the imputation procedure,
and shows the percentage of cases with imputed data. The last column shows the percentage
of cases with imputed values excluding owners/operators that completed an abbreviated survey
form. Percentages are based on the unweighted number of survey responses (36,222 in total).
Table A.7: Variables with Imputed Values, Imputation Procedure, and Percentage Imputed

Variable | Imputation Procedure | Percent Imputed | Percent Imputed (excluding large fleet)
Year of manufacture (Registry data field) | Nearest neighbor by aircraft type by engine manufacture model | 11.0 | 9.8
State primarily flown | Assign state of registration from Registry Master | 25.1 | 23.5
Airframe hours * | Nearest neighbor by aircraft type by age | 22.2 | 1.8
Percentage of hours by use (e.g., personal, business transport) | Mean values by aircraft type | 2.7 | 0.7
Percentage of hours rented/leased * | Nearest neighbor by aircraft type by engine manufacture model | 21.9 | 1.5
Percentage of hours public use | Nearest neighbor by aircraft type by engine manufacture model | 2.2 | 1.6
Percentage of hours fractional ownership | Nearest neighbor by aircraft type by engine manufacture model | 0.3 | 0.4
Percent of hours by flight plans/flight conditions * | Mean values by aircraft type | 21.9 | 1.5
Number of landings | Nearest neighbor by aircraft type by engine manufacture model by age | 4.3 | 2.8
Landing gear * | Nearest neighbor by aircraft type by engine manufacture model | 22.1 | 1.8
Fuel type * | Nearest neighbor by aircraft type by engine manufacture model | 22.3 | 2.0
Fuel burn rate | Nearest neighbor by aircraft type by engine manufacture model | 3.7 | 1.8
Avionics equipment * | Nearest neighbor by aircraft type by engine manufacture model by age | 27.5 | 3.2

Percentages are based on unweighted survey responses (total 36,222).
* Question not asked on the abbreviated survey form administered to owners/operators of multiple aircraft.
Survey Content
The 2009 GA Survey questionnaire requests the aircraft owner or operator to provide
information on flight activity, flight conditions, where the aircraft was flown, and aircraft
characteristics. Key variables derived from the survey responses include:
• number of total hours flown in 2009, hours flown by use, and total lifetime airframe hours
• the state in which the aircraft was flown most of the survey year and hours flown in the state of Alaska
• hours flown by flight plan and flight conditions, including flight under Instrument Meteorological Conditions (IMC) and Visual Meteorological Conditions (VMC) during the day and night
• hours flown as part of a fractional ownership program, rented or leased, or used to fulfill a government function
• type of landing gear and number of landings in 2009
• fuel type and average fuel burn rate
• avionics equipment installed in the aircraft, including ice protection systems.
Two changes were made to the 2009 survey questionnaire to improve the quality of data on
avionics equipment. As described in Table A.2, two items about installed general equipment
(Air Bag/Ballistic Parachute) and installed transponder equipment (ADS-B (Mode S)) were
refined to collect more detail.
Data Collection Methods
Collecting Data from Owners/Operators of a Single Aircraft
Appendix B presents the materials used to conduct the 2009 survey. The survey form
administered to owners/operators of a single aircraft is shown in Figure B.1. The postcard
invitation to the Internet component and the reminder/thank-you postcard are shown in Figure
B.2. Surveys sent to aircraft owners who started, but did not complete, an Internet survey
included a special insert (Figure B.3). Surveys mailed to Alaskan addresses included an insert
with the endorsement of Alaska aviation associations encouraging owners to participate (see
Figure B.4). Each of the three mailings for the survey was accompanied by a cover letter, shown
respectively in Figures B.5, B.6, and B.7.
The protocol used for the 2009 survey is similar to that used since the 2000 survey. The survey
data were collected from owners and operators of the sampled aircraft through two venues—the
Internet and mailings of the questionnaire. We implemented the Internet component before the
mailing portion to maximize the number of responses collected electronically. We first sent the
owners/operators of sampled aircraft a postcard inviting them to complete the survey on the
Internet (mailed on April 30, 2010). The Internet survey site remained open through August 31,
2010.
We mailed survey questionnaires to owners/operators of sampled aircraft three times during the
field period as well as a reminder/thank you postcard between the first and second mailings.
Each mailing was sent to owners/operators that had not yet responded to the survey at that time
and had not been assigned a final disposition (e.g., refused, respondent deceased,
undeliverable with no new address). We mailed the first questionnaire on May 27, 2010,
followed by the reminder/thank you postcard on June 11, 2010. The second and third mailings
were sent July 1, 2010 and July 23, 2010, respectively.
Starting with the 2009 survey, we also placed telephone follow-up calls to owners/operators of
individual aircraft that started but did not complete the Internet survey or returned an incomplete
mail survey. Telephone staff encouraged owners/operators to complete the survey and offered
technical assistance, but the survey itself was not conducted by telephone.
Collecting Data from Owners/Operators of Multiple Aircraft
The 2009 GA Survey continued the effort initiated in 2004 to increase cooperation among
respondents who own or operate multiple aircraft. The 2009 survey employed data collection
tools and methods similar to those introduced in 2004, including extensive effort to contact
owners/operators of multiple aircraft by telephone to encourage participation among nonresponders after the first mailing. The survey forms, cover letters, and reminder letter are
presented in Appendix B, Figures B.8–B.12.
The responses of multiple-aircraft owners/operators are important for accurately estimating
general aviation activity. Because of the increased burden of reporting for multiple aircraft, there
was a concern that these operators were less likely to respond to the survey. After selecting the
sample, we identify groups of aircraft belonging to the same operator using three resources:
FAA’s Operations Specifications Subsystem (OPSS), databases available from aviation
associations, and the Civil Aviation Registry’s Master file. Operators or owners with three or
more aircraft are classified as “multiple owners/operators” or “fleets” for survey purposes.
Owners/operators of multiple aircraft receive an abbreviated survey form to minimize the
reporting burden. The form, developed in cooperation with several aircraft operators and
aviation associations, allows an operator to report a summary of activity for a group of aircraft of
a similar type instead of requiring the operator to complete a separate and longer questionnaire
for each individual aircraft. This survey form (Appendix B, Figure B.8) collects data on key
variables for major classes of aircraft (e.g., hours flown, how flown, fuel consumption, fractional
ownership, and number of landings). The form does not collect data on flight conditions, fuel
type, landing gear, or avionics.
Data collection for multiple-aircraft owners/operators followed the same timing as that for
owners/operators of single aircraft. Like the standard survey protocol, we programmed an
Internet survey that matched the hard-copy survey form and the on-line survey remained open
throughout the field period. We mailed survey questionnaires three times during the field period
as well as a reminder letter between the first and second mailings. Each mailing was sent to
owners/operators of multiple aircraft that had not yet responded to the survey at that time and
had not been assigned a final disposition. The first survey mailing was sent May 10, 2010
followed by a reminder letter on May 28, 2010. The second and third mailings were sent June
11, 2010 and July 23, 2010, respectively.
To maximize survey response, we placed follow-up telephone calls to all multiple-aircraft
owners/operators who had not responded. The telephone effort, which was prioritized by fleet
size, began June 7, 2010 and continued through the field period. The calling effort focused on
encouraging survey participation as well as ensuring that survey mailings were reaching the
appropriate person in the operator’s organization.
The alternate survey form for owners/operators of multiple aircraft has reduced respondent
burden and improved representation of activity among high-end and high-use aircraft. The
alternate data collection track for owners/operators of multiple aircraft consistently accounts for
approximately 20 percent or more of all aircraft responding to the survey (21.1 percent of all
survey completes in 2009, 24.8 percent of all survey completes in 2008, 20.5 percent in 2007,
and 22.8 percent in 2006).
Response Rate
The response rate is calculated conservatively following guidelines published by the American
Association for Public Opinion Research (AAPOR), a professional association that establishes
standards, “best practice” guidelines, and a code of ethics for professional survey researchers
and research firms.9 Specifically, the response rate is computed as the number of completed
and partial surveys returned divided by the total number of eligible aircraft in the sample, using
the following formula:

RR = (C + P) / [(C + P) + (NR + INS + REF + PMR + UNK)]
[Footnote 9] The American Association for Public Opinion Research. 2000. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys. Ann Arbor, MI: AAPOR.
Where
RR = Response Rate
C = Completed survey
P = Partial survey
NR = No response
INS = Insufficient complete; a partial survey that is not sufficient to count as a complete
REF = Refused
PMR = Post Master Returned, no new address
UNK = Unknown eligibility
The numerator comprises completed surveys and partial surveys that provide enough
information to be used for analysis. Partial surveys must include information on hours flown to
be included in the numerator.
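In code, the calculation amounts to the following. The counts below aggregate the published 2009 totals (36,222 completed or partial surveys out of 84,411 eligible sampled aircraft, so the remaining dispositions together sum to 48,189); the breakdown across NR, INS, REF, PMR, and UNK is not reproduced here.

```python
def response_rate(completes, partials, other_eligible):
    """AAPOR-style response rate: (C + P) over all eligible sampled
    aircraft; `other_eligible` aggregates NR + INS + REF + PMR + UNK."""
    returned = completes + partials
    return returned / (returned + other_eligible)

# 36,222 completed/partial surveys (completes and partials combined here),
# plus 48,189 eligible non-responding cases = 84,411 eligible aircraft.
rate = response_rate(36222, 0, 48189)
print(f"{rate:.1%}")  # 42.9%
```

Because unknown-eligibility and undeliverable cases stay in the denominator, this is a conservative (lower-bound) estimate of the true response rate.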
In addition to completed and partial surveys, the denominator includes cases for which no
response was received, insufficiently completed surveys (i.e., no data reported for hours flown),
refusals, surveys returned as undeliverable by the USPS, and cases of unknown eligibility. The
last category includes aircraft for which the owner cannot be identified or cannot report on the
aircraft’s activity (e.g., the owner is deceased and the survivors cannot report on the aircraft’s
activity, or the survey recipient does not own the aircraft listed).
The denominator includes aircraft that were sold or destroyed during the survey year. The
survey collects data on flight activity for the portion of the year the aircraft was eligible to fly, and
data collection efforts attempt to identify and mail surveys to new owners.
The denominator excludes aircraft known not to be part of the general aviation fleet or known
not to be eligible to fly during the survey year. These are aircraft that were destroyed prior to the
survey year, displayed in a museum, operated primarily as an air carrier, operated outside of the
US, or exported overseas.
Table A.8 shows the final response rate by mailing and overall, along with the number of
completed surveys. The number of completed surveys shown here excludes duplicate surveys
after cleaning the returned survey data to retain the form with the most complete information.
The overall response rate for the 2009 GA Survey was 42.9 percent. Almost 60 percent of
responses were received on the Internet and slightly more than one-quarter were received from
the first mailing. The second and third mailings had lower response, but these rates are
calculated conservatively. For example, the Mail 3 response rate is the proportion of sampled
aircraft that returned that hard-copy survey. If a third mailing was sent but the survey was later
completed on-line, the response is recorded as “Internet.”
Table A.8: Response Rate by Mailing

Mailing | Completes | Response Rate | % Total Response
Internet | 20,985 | 24.8% | 57.9%
1st Mailing | 9,879 | 11.6% | 27.3%
2nd Mailing | 3,263 | 3.8% | 9.0%
3rd Mailing | 2,095 | 2.5% | 5.8%
Overall | 36,222 | 42.9% | 100.0%
As noted above, the response rate is calculated conservatively and retains all non-responding
surveys, sampled units with bad addresses, and sampled aircraft of unknown eligibility in the
denominator. In the 2009 survey, 5,871 surveys were returned undeliverable and we were
unable to obtain updated address information. In addition, the survey sample itself included over
5,000 aircraft that could not be contacted because their status was “Sale Reported,”
“Registration Pending” or the address was already known to be incorrect (i.e., Postmaster
Return status on the Registry). Applying guidelines for defining the GA population developed
with the FAA and Registry staff, these aircraft are deemed potentially active and therefore
eligible for selection into the survey.
Table A.9 illustrates the steady increase in the Internet response as a percentage of all returned
surveys from 2000 to 2009. Almost 60 percent of survey responses were received by Internet in
2009, a share very similar to the previous survey year (57.9 percent of all responses in both
years). Since the survey was first made available on-line in 2000, the share of Internet
responses has risen from 32.8 to 57.9 percent. The growth in response via the Internet has
made it possible to field an expanded GA Survey, manage larger sample sizes, and process
more data efficiently and cost-effectively.
Table A.9: Percentage of All Completed Surveys Responding by Internet

Year | Total Sample Size | Total Completes | Internet Completes | Internet % of Total
2000 | 31,039 | 15,689 | 5,144 | 32.8%
2001 | 30,886 | 16,432 | 5,954 | 36.2%
2002 | 30,817 | 15,254 | 5,304 | 34.8%
2003 | 31,996 | 14,471 | 6,059 | 41.9%
2004 | 75,659 | 32,056 | 13,441 | 41.9%
2005 | 77,403 | 34,248 | 14,555 | 42.5%
2006 | 84,486 | 38,973 | 17,266 | 44.3%
2007 | 84,570 | 38,920 | 19,268 | 49.5%
2008 | 82,277 | 35,607 | 20,611 | 57.9%
2009 | 85,086 | 36,222 | 20,985 | 57.9%
Table A.10 shows response rates by aircraft type. Response rates for most aircraft types are
very similar to the previous survey year. Fixed wing–turbojets and rotorcraft are the exceptions;
response rates among these aircraft dropped slightly compared to 2008. Light-sport aircraft
continue to have a higher response rate than most other categories, probably reflecting their
more recent registration dates (and therefore more accurate contact information).
Table A.10: Response Rate by Aircraft Type

Aircraft Type | Sample | Invalid Sample [10] | Completes | Response Rate
Fixed Wing - Piston: 1 engine, 1-3 seats | 6,070 | 26 | 2,495 | 41.3%
Fixed Wing - Piston: 1 engine, 4+ seats | 14,970 | 68 | 6,087 | 40.8%
Fixed Wing - Piston: 2 engines, 1-6 seats | 5,060 | 40 | 2,055 | 40.9%
Fixed Wing - Piston: 2 engines, 7+ seats | 2,862 | 84 | 1,116 | 40.2%
Fixed Wing - Turboprop: 1 engine | 4,293 | 54 | 1,966 | 46.4%
Fixed Wing - Turboprop: 2 engines, 1-12 seats | 4,653 | 28 | 1,696 | 36.7%
Fixed Wing - Turboprop: 2 engines, 13+ seats | 1,019 | 3 | 360 | 35.4%
Fixed Wing - Turbojet: 2 engines | 12,586 | 173 | 5,245 | 42.3%
Rotorcraft: Piston | 5,246 | 23 | 1,538 | 29.4%
Rotorcraft: Turbine, 1 engine | 5,819 | 24 | 2,564 | 44.2%
Rotorcraft: Turbine, Multi-engine | 1,658 | 22 | 865 | 52.9%
Other Aircraft: Glider | 1,922 | 10 | 893 | 46.7%
Other Aircraft: Lighter-than-air | 3,183 | 27 | 1,232 | 39.0%
Experimental: Amateur | 6,612 | 44 | 3,934 | 59.9%
Experimental: Exhibition | 1,978 | 18 | 865 | 44.1%
Experimental: Other | 2,133 | 20 | 821 | 38.9%
Light-sport | 5,022 | 11 | 2,490 | 49.7%
Total | 85,086 | 675 | 36,222 | 42.9%

[Footnote 10] Even though efforts are made to remove non-GA aircraft from the population before the sample is selected, a small number of surveys are returned indicating that the aircraft should not be part of the survey population (e.g., the aircraft was used primarily as a Part 121 air carrier, or was a museum piece the entire survey year). The Invalid Sample represents such aircraft, which are excluded from response rate calculations.
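Because the invalid sample is excluded, each response rate in Table A.10 is computed against the valid sample only. Recomputing one row (single-engine piston, 1-3 seats) as a check:

```python
# Single-engine piston, 1-3 seats (Table A.10):
# 6,070 sampled, 26 invalid, 2,495 completes.
sample, invalid, completes = 6070, 26, 2495
rate = completes / (sample - invalid)   # invalid sample excluded from denominator
print(f"{rate:.1%}")  # 41.3%
```

The same arithmetic reproduces the overall rate: 36,222 / (85,086 - 675) rounds to 42.9 percent.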