Supporting Statement for Paperwork Reduction Act Submission:
Coral Reef Valuation Study
OMB Control No. 0648-xxxx
U.S. Department of Commerce
National Oceanic and Atmospheric Administration
National Ocean Service
Office of National Marine Sanctuaries and Office of Response and Restoration
1305 East West Highway, SSMC4, 9th floor
Silver Spring, MD 20910
Contact: Norman Meade
(301) 713-4248 ext. 201
Norman.Meade@noaa.gov
March 10, 2009
Table of Contents

A. Justification
   1. Explain the circumstances that make the collection of information necessary
      Background
      Request
   2. Explain how, by whom, how frequently, and for what purpose the information will be used. If the information collected will be disseminated to the public or used to support information that will be disseminated to the public, then explain how the collection complies with applicable NOAA Information Quality Guidelines
      How the information will be collected
      The main survey instrument
         General instructions to KN and Abt SRBI operations
         Instructions/warm-up (Screens 1 through 2C)
         Part 1: Survey setup (Screens 3A through 3C)
         Part 2: Introduction (Screens 4A through 12B; Questions Q1 through Q5)
         Part 3: Overfishing (Screens 13A through 16D; Questions Q6 through Q7)
         Part 4: Ship accidents (Screens 17A through 19B; Questions Q8 through Q9)
         Part 5: Choice questions/follow-up evaluation (Screens 20A through 41; Questions Q10 through Q29, A1 through A2a, and D1 through D2)
      Use of illustrations
      Experimental design
      Use of stated choice questions
      Survey mode
         Pretest survey
         Main survey
      Frequency of the information collection
      How collection complies with NOAA information quality guidelines
         Utility
         Objectivity
         Integrity
   3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological techniques or other forms of information technology
      Automated, electronic data collection
   4. Describe efforts to identify duplication
   5. If the collection of information involves small business or other small entities, describe the methods used to minimize burden
   6. Describe the consequences to the Federal program or policy activities if the collection is not conducted or conducted less frequently
   7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with OMB guidelines
   8. Provide information on the PRA Federal Register Notice that solicited public comments on the information collection prior to this submission. Summarize the public comments received in response to that notice and describe the actions taken by the agency in response to those comments. Describe the efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported
   9. Explain any decisions to provide payments or gifts to respondents, other than remuneration of contractors or grantees
      Cognitive one-on-one interviews
      Pretest survey
      Main survey
      Survey-specific incentives
      Nonsurvey-specific incentives
   10. Describe any assurance of confidentiality provided to respondents and the basis for assurance in statute, regulation, or agency policy
      KN procedures
      Abt SRBI procedures
   11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private
   12. Provide an estimate in hours of the burden of the collection of information
   13. Provide an estimate of the total annual cost burden to the respondents or record-keepers resulting from the collection (excluding the value of the burden hours in #12 above)
   14. Provide estimates of annualized cost to the Federal government
   15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of OMB 83-I
   16. For collections whose results will be published, outline the plans for tabulation and publication
   17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate
   18. Explain each exception to the certification statement identified in Item 19 of the OMB 83-I

B. Collections of Information Employing Statistical Methods
   1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved
      This application is for the cognitive one-on-one interviews, a second pretest, and the main survey study only
         Cognitive one-on-one interviews
         Pretest survey implementation
         Main survey implementation
   2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden
      Sample frame and sample selection
         Pretest survey
         Main survey
      KN Panel sampling design for the main survey
         ANES Web Panel recruitment response rate statistics
      Abt SRBI Panel sampling design for the main survey
      Sample size
         Cognitive interviews
         Pretest survey
         Main survey
   3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield “reliable” data that can be generalized to the universe studied
      Maximizing response rates
      Nonrespondents
   4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved, OMB must give prior approval
   5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency

Bibliography

Attachment 1: Coral Reef Survey Instrument
Attachment 2: Write-up of Pretest Results
Attachment 3: KN’s Member Bill of Rights
Attachment 4: Quality Assurance Procedures
Attachment 5: Illustrations
Attachment 6: Authorities
Attachment 7: Federal Register Notification
A. JUSTIFICATION
1. Explain the circumstances that make the collection of information necessary.
Background
The National Oceanic and Atmospheric Administration (NOAA) is a member of the United
States (U.S.) Coral Reef Task Force (CRTF), which was established in June 1998 through
Executive Order (EO) 13089. As a member of the CRTF, and in support of the U.S. Coral Reef
Initiative, NOAA has significant responsibilities for managing U.S. coral reef habitats and
undertaking scientific research studies to better understand the nation’s coral reef resources (see
Attachment 6 for a full list of authorities).
NOAA currently manages three National Marine Sanctuaries (NMS) with coral reef resources
under the National Marine Sanctuaries Act (NMSA, 16 U.S.C. 1431, et seq.): the Florida Keys
National Marine Sanctuary (FKNMS), the Flower Garden Banks National Marine Sanctuary,
and the Hawaiian Islands Humpback Whale National Marine Sanctuary. NOAA also has the
authority to conduct research to understand the use of marine protected areas (MPAs) under EO 13158. In order to more efficiently conduct surveys about public preferences for attributes of marine environmental resources located in the U.S., including coral reef ecosystems, NOAA needs to conduct research on the validity and reliability of using Internet-based panels of respondents recruited through alternative methods that result in different underlying response rates.
Request
This information collection request is for conducting up to 32 cognitive one-on-one interviews to
test changes made since the first pretest, conducting a second pretest using Knowledge
Networks’s (KN’s) established Internet Panel, and implementing the main survey concurrently
using the June 2009 wave of the American National Election Study (ANES) Internet Panel
recruited by KN and Stanford’s Major Research Instrumentation (MRI) Internet Panel recruited
by Abt SRBI, a subsidiary of Abt Associates. There is limited opportunity to have outside
research projects use these panels. Because of the ANES and MRI Panel schedules, the only
opportunity for administration of the coral reef survey is the June 2009 wave. We request the
Office of Management and Budget (OMB) approval to administer the survey on the June
2009 wave.
2. Explain how, by whom, how frequently, and for what purpose the information will be
used. If the information collected will be disseminated to the public or used to support
information that will be disseminated to the public, then explain how the collection
complies with applicable NOAA Information Quality Guidelines.
How the information will be collected
This request is for up to 32 cognitive one-on-one interviews, a second pretest, and the full-scale
implementation of an Internet-based survey instrument designed to estimate individuals’
preferences and economic values of the Hawaiian coral reef ecosystem. Four members of the
project team will conduct up to 16 cognitive one-on-one interviews on 2 consecutive nights in
Denver, Colorado, the week of March 30, 2009, and another 16 interviews in Washington, D.C.,
the week of April 6, 2009. These members will include staff from NOAA and from Stratus
Consulting. We will randomly select established KN Panel members (those that are separate
from the ANES Panel, which is developed by KN) to participate in the one-on-one interviews.
Participants will take the survey using a computer with Internet connection, a set-up similar to
conditions for the main survey. The purpose of these interviews is to test participants’
understanding of the material presented to them and to evaluate certain sections of the survey or
wording issues. Information gathered from these interviews is not intended to be used to make
major changes to the survey instrument.
The next activity for which we are seeking approval once the cognitive interviews are completed
is conducting a second pretest. Based on review by OMB and others, we have made changes to
the survey instrument (see the next section called “The main survey instrument”) since the first
pretest, including changes to the experimental design. We plan to test these changes using a
randomly selected sample of 385 established KN Panel members, from which we expect 250
completed interviews (a completion rate of about 65%). The purpose of this pretest is to test the programmed survey
instrument. We need to make sure that there are no issues with the programming of the survey
and that the experimental design (as described below) is appropriate for the full administration of
the main survey. This pretest will not involve any participants from the ANES or MRI Panels.
Once KN sends us the pretest data, we will analyze it using simple summary statistics, develop a
presentation on the results, and make any changes, if necessary, to the programming and/or
experimental design before implementing the main Internet survey using the ANES and MRI
Panels in June 2009. The pretest is currently scheduled to begin the week of April 20, 2009 and
is expected to be in the field for three weeks.
The main survey will be administered to two independent Internet Panels: the ANES Internet
Panel and the MRI Internet Panel. KN will develop the ANES Panel and Abt SRBI will develop
the MRI Panel. For the ANES Panel, KN has recruited a sample of 2,000 panel members using
Random Digit Dialing (RDD). Abt SRBI has recruited a sample of 990 panel members from the
U.S. Postal Service (USPS) mailing address lists for the MRI Panel.
Most Internet-based surveys currently depend on RDD recruiting to build their panels. This type
of recruiting involves several steps, including initial and follow-up telephone calls,
administration of a screener, and recruitment of the panel. At each step, potential panelists are
lost, which can result in overall survey response rates as low as 20%. The representativeness of
telephone-recruited panels is therefore open to question, but it has been difficult to evaluate the
actual extent of any nonresponse biases.
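To illustrate how such stage-wise attrition compounds (the stage rates here are assumed purely for illustration, not measured values): if 45% of contacted households complete the screener, 65% of those agree to join the panel, and 68% of panelists complete a given survey, the cumulative response rate is roughly 0.45 × 0.65 × 0.68 ≈ 0.20, or 20%.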
KN and Abt SRBI are developing the two Internet Panels independently of this specific data
collection effort under a grant from the National Science Foundation (NSF). The panels are being
recruited to administer a series of surveys over multiple months (waves). These panels are part of
a research project (designed by KN and Abt SRBI in cooperation with Professor Jon Krosnick of
Stanford University and others) to evaluate the representativeness of RDD-recruited Internet
Panels. Abt SRBI will recruit the MRI Panel by contacting households face-to-face, which is
expected to generate an overall response rate of approximately 63%.[1] Results from the surveys
performed by the MRI Panel can then be used to evaluate the representativeness of the ANES
Panel survey results. More details on procedure are provided below.

[1] This value comes from an NSF-sponsored demonstration project on face-to-face recruitment of an Internet survey panel.
We propose using the June 2009 survey wave to collect information for our Coral Reef
Economic Valuation Study survey. In this way, our survey can depend heavily on KN’s ANES
Panel for cost-effective survey implementation. At the same time, we can use responses from
surveying the MRI Panel to evaluate any nonresponse biases in the RDD results.
We plan to test estimates of total value obtained from the two panels that use different recruiting
methods and which result in different underlying response rates. The survey focuses on
overfishing and ship groundings, which are among the most widespread threats to the reef
ecosystems. Two methods of protection are presented in the survey: (1) restoration of the coral
reef ecosystems of the Main Hawaiian Islands (MHI) through the establishment of a special category of MPAs known as
marine reserves or no-take areas; and (2) restoration of coral habitats after vessel groundings.
The survey uses a stated choice conjoint framework to evaluate the willingness of study subjects
to trade off these actions against each other at a cost and against taking no action. The survey has
gone through extensive design, development, qualitative research, and pretesting. After we
complete the one-on-one interviews and second pretest, the main survey will be ready to be fully
implemented using the ANES and MRI Panels.
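As a sketch of one simple form the planned cross-panel comparison could take (the detailed analysis plan is not specified in this section), mean willingness-to-pay estimates from the two independent panels could be contrasted with a two-sample z-statistic,

\[ z = \frac{\overline{WTP}_{ANES} - \overline{WTP}_{MRI}}{\sqrt{se_{ANES}^2 + se_{MRI}^2}} \]

where a statistically insignificant difference would suggest that the lower-response-rate RDD panel yields total-value estimates comparable to those from the face-to-face-recruited benchmark.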
The study effort involves three main phases. Phase 1, which has been completed, involved the
development of the initial survey instrument, qualitative research through focus groups and
cognitive one-on-one interviews to test out and improve the instrument, and implementation of a
small-scale pretest of the survey instrument, which OMB previously approved (OMB Control No.
0648-0531, expired 8/31/2006). Results of the pretest are provided in Attachment 2. In Phase 2,
the current phase of this project, we will complete the three tasks described above that require
OMB approval. Phase 3 of the effort will involve analysis of the survey data and development of
a final study report. Details relating primarily to Phase 2 are presented in this supporting
statement, because this application is to conduct cognitive one-on-one interviews, a second
pretest, and the main survey.
The main survey instrument
This instrument has been revised in light of peer review, the first pretest results, and initial
comments from OMB. The version included in this package differs from the executable version
in format only. An online address to review the Web Interface version will be provided to OMB
as soon as all the screens have been programmed correctly. We designed the survey to provide
respondents with adequate levels of information about the MHI coral reef ecosystems, the
problems facing such ecosystems, and potential management actions that might be undertaken to
help protect and restore them. To ensure that the information is accurate, numerous coral reef
researchers reviewed the survey instrument.
Throughout the development and presentation of materials in the survey, we have strived to
present information in a balanced, neutral manner. Discussions of details of this balance are
provided in the individual sections below.
As the information is presented, it is divided into sections by questions designed to encourage
review and consideration of survey information and to provide us with feedback on respondents’
preferences based on the information they have seen up to that point.
For purposes of review and comment, the survey instrument included in this submission has
labels for Screens, where the information and/or questions will appear in the online executable
version. The question numbers will not appear in the executable version; they are used only to
track comments and suggestions in the review process.
Summaries of the major sections of the main survey follow.
General instructions to KN and Abt SRBI operations
The first page will not appear in the application. This is a tracking sheet for internal operations at
KN and Abt SRBI. The general instructions to the KN and Abt SRBI operations begin on the
second page. These instructions lay out several features and capabilities that the researchers and
funding agencies want KN and Abt SRBI to implement in the online executable version of the
questionnaire.
Instructions/warm-up (Screens 1 through 2C)
Screen 1 begins the survey. The survey is presented with and without audio (some panel
members may not have audio capability). Screen 1 and Screen 2A test whether the panel member
has audio capability to determine which version of the survey they will receive. Question S2A
asks if the panel member heard the music for testing the audio capability. If the respondent
answers “yes,” they will see Screen 2B, which informs them that some instructions are given by
audio and that they should turn up their audio. Respondents are also reminded to read the screen
carefully, even if audio is provided.
Screen 2C presents respondents with a question (Q2D1 or Q2D2) from the General Social
Survey (GSS), which is placed at the beginning of the survey to serve both as a warm-up and to
provide information to help evaluate potential attitudinal differences between the respondents to
our survey and respondents to the GSS. Half the sample, selected at random, will receive
Question Q2D1; the other half will receive Question Q2D2.
Part 1: Survey setup (Screens 3A through 3C)
Part 1 introduces the topic of the survey: management options for coral reefs in Hawaii. It gives
the initial explanation of the purpose of the survey and explains why respondents’ opinions are
important. It explicitly identifies NOAA as a U.S. government agency funding the survey. The
NOAA logo will be prominently displayed on the initial screen of the survey. At the bottom of
Screen 3A, panel members (respondents) are informed that their participation is voluntary and
are provided an opportunity to obtain more information. Respondents wanting more information
are sent to Screen 3B, where information is provided about the policies regarding survey
participation and efforts to protect their privacy (see Attachment 3 for KN’s Panel Member Bill
of Rights). Respondents are also provided an 800-telephone number if they have any questions.
Screen 3C informs respondents that this survey will present information about coral reefs,
including pictures and maps. Respondents also learn that they can move forward or backward in
the survey through links provided on the lower left corner of each screen, and return to wherever
they were in the survey before linking to any information.
Part 2: Introduction (Screens 4A through 12B; Questions Q1 through Q5)
The introduction presents information about coral reefs and coral reef ecosystems using text and
an illustration (Screen 4A). The text describes what a coral reef ecosystem is and where coral
reefs are found, highlighting the types of marine animals found on and near coral reefs. This
information is followed by Questions Q1 through Q3, which ask how often a respondent has read
or heard about coral reefs (on Screen 4), how many times he/she has been to a coral reef in the
U.S. or elsewhere (on Screen 5), and, if a respondent has been to a reef before, where this visit
occurred (on Screen 6). The responses to Question Q3 can be used to differentiate survey
respondents’ level of previous familiarity with coral reefs.
On Screen 7, respondents learn that 10% of U.S. coral reefs are found around the Hawaiian Islands;
most other U.S. coral reefs are found around Florida. A map is used to show respondents the location
of the MHI (on Screen 8). The text below the map communicates how extensive the MHI are and
how people use them. Screen 9 then shows another map of the Hawaiian Islands that highlights
the Northwestern Hawaiian Islands (NWHI). The text below the map describes more about the NWHI. The introductory section
ends with Questions Q4, Q4A, and Q5, which ask whether respondents have either lived in or
visited Hawaii in the past and how likely they are to visit Hawaii in the next 10 years (on Screens
10A, 10B, and 11). These questions will be used to segment respondents whose values might
include direct economic use value from those who hold purely nonuse/passive economic
use values.
On Screen 12A, respondents see four pictures of reefs, reef fish, and other sea life found on
Hawaii's coral reefs. These pictures provide a transition between answering questions and
receiving the next section of information. The final screen in Part 2 (Screen 12B) highlights two
reasons why coral reef ecosystems around Hawaii are unique: (1) 25% to 50% of the species
found around the Hawaiian Islands do not occur anywhere else in the world, and (2) the NWHI
reefs are in a remote location and still in a relatively unaltered natural state (i.e., mostly
untouched by humans).
Part 3: Overfishing (Screens 13A through 16D; Questions Q6 through Q7)
This section introduces overfishing as the first of two main threats to coral reef health in the
MHI. The section first describes what is meant by “overfishing” and the ways that it can affect
reef health.
Illustrations are used to show respondents current conditions at the MHI and how the MHI
looked before overfishing occurred. By seeing the two illustrations side by side, respondents can
see that under conditions before overfishing occurred, there are more reef fish and healthier coral
ecosystems than under current conditions.
A solution to the overfishing problem — implementation of no-fishing zones — is then
described. Respondents learn that this management tool has been effective in other locations,
such as Florida, in helping to improve coral reef health. Respondents are told that other activities such
as recreational diving can still occur in no-fishing zones. The text also highlights some
undesirable consequences associated with developing no-fishing zones, including additional
government spending, potential loss of commercial fishing jobs, and displacement of recreational
fishing. Presenting this information demonstrates to respondents that protection comes at a cost.
Following the discussion of no-fishing zones, Question Q6 asks respondents whether they agree
with statements about three issues: commercial fishing jobs, sport fishing opportunities, and
federal government involvement. This question serves two purposes. First, it breaks up the
presentation of important information and second, it provides additional information to assess
respondents’ preferences for protecting coral reefs.
Next, respondents learn about a proposal to increase no-fishing zones from the current 1% to a
new level of 25% of the coral reef ecosystems in the MHI. We present some reasons for and
against enlarging no-fishing zones to ensure that a balanced and neutral presentation on these
issues is given to respondents.
Screens 16A through 16C address the proposal in detail. Illustrations are used to compare
conditions in 10 years (1) if no-fishing areas continue to protect only 1% of coral reefs, and (2) if
no-fishing areas are expanded to protect 25% of coral reefs. Finally, Question Q7 asks
respondents if they have any comments about the information provided so far.
Part 4: Ship accidents (Screens 17A through 19B; Questions Q8 through Q9)
Part 4 introduces the second of the two main threats to coral reef health in the MHI: ship
accidents. Ship accidents occur about 10 times a year in the MHI, and each accident can significantly
damage a localized area of the reef. This section describes the effects of ship groundings in the
MHI and highlights the fact that natural recovery of the reefs from these groundings typically
takes about 50 years. During this time, a reef’s health, and many of the coral reef-associated
activities such as snorkeling and diving, may be affected. The ship grounding scenario provides a
description of localized impacts on ecosystem health, contrasting with the broader effects
associated with overfishing. It is included to help elicit a range of values for the types of
management actions that are available to help improve the coral reef health in the MHI.
Illustrations are used to compare current conditions of coral reefs in the MHI and conditions
immediately following a ship accident (e.g., a damaged reef).
On the next screen, Question Q8 asks respondents whether they have ever heard about, read
about, or seen places where ship accidents have injured coral reefs in Hawaii or elsewhere.
Next, respondents learn that management actions, such as planting living coral from coral farms
into injured areas and restoring injured coral that is still alive, could help the reef recover faster
after ship accidents (10 years rather than 50 years). This section explains that these actions have
been effective in other locations, such as Florida, in restoring the reefs in a much shorter period
compared to natural recovery.
The next screen tells respondents that the federal government, with the State of Hawaii, is
considering a new program to repair ship injuries to coral reefs that would repair about 10 sites
(about 5 acres) each year. Again, as with the overfishing solution, the respondents are given the
pros and cons of this management action.
Respondents are told it is not possible to make boat and ship owners pay for repairs because it is
often difficult to determine which ship caused an injury. This information helps avoid protest
responses from respondents who would consider it unfair for them to pay for injuries that boat
and ship owners are responsible for.
Finally, Question Q9 asks respondents if they have any comments about the information
provided so far.
Part 5: Choice questions/follow-up evaluation (Screens 20A through 41; Questions Q10 through
Q29, A1 through A2a, and D1 through D2)
In Part 5, respondents are asked to identify which combination, if any, of the management
actions they prefer. The two management actions (no-fishing zones, and restoration of ship
accident damages) are summarized, and a series of stated-choice questions is asked. To clarify
this task for respondents, a warm-up question is presented on Screen 20C. On this screen,
respondents will see the Current Program (the status quo) and either the No-Fishing Zone
Program or the Reef Repair Program. The question on the screen asks respondents to pick their
most preferred alternative out of the two choices.
Starting with Screen 20D, each stated choice question asks respondents to choose between the
presented programs, with each program described in terms of management actions and cost to
the respondent’s household in the form of increased federal income taxes. The Current Program
is always the status quo: no new no-fishing zones in the MHI, no additional efforts to restore
vessel grounding damages, and no additional taxes. The Full Program includes a combination of
new no-fishing zones in the MHI and additional efforts to restore vessel grounding damages,
which results in the greatest increase in new taxes. The No-Fishing Zones and Reef Repair
Programs include one management action beyond the status quo (based on their respective titles)
and some increase in taxes.
Question Q10 asks respondents to specify which of the four programs they most prefer, after
reminding them to consider the effectiveness of each management option and their budget
constraints. Question Q11 then asks respondents to provide a brief comment explaining why they
chose the program they did. This comment serves several purposes. First, it can help distinguish
between true zero values and protest answers, and it allows the research team to better
understand how confident respondents were in their answers and whether they took the choice
task seriously. Second, it provides insight into each individual's thought process, which
subsequently helps identify valid and invalid responses. Third, it gives individuals an opportunity
to express how they feel about being asked this type of question, which is especially important
for those respondents who clearly dislike some element of the question. This comment question
is not repeated for the other choice questions because experience indicates that little additional
information is gained from repeating it.
Next, respondents are presented with the remaining three programs and asked to check which of
these is their most preferred. The screen is then replaced once more with the two remaining
programs and respondents are asked to choose their most preferred. Each respondent is asked
three stated choice questions to limit potential respondent fatigue. As has become standard
practice in stated preference studies, we introduce a “certainty question” after each choice
question.
Asking respondents to identify their most preferred program, then their most preferred among
the remaining three, and finally their most preferred of the remaining two provides a complete
ranking of all the programs in each choice set. Complete rankings provide rich information on
preferences that will be very useful in data analysis and value estimation.
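One standard way to exploit such complete rankings (noted here as an illustration of why rankings are statistically informative, not necessarily the estimator that will be used) is the rank-ordered, or "exploded," logit, which decomposes the probability of an observed ranking of the four programs into a product of successive first-choice probabilities:

\[ P(a \succ b \succ c \succ d) = \frac{e^{V_a}}{e^{V_a} + e^{V_b} + e^{V_c} + e^{V_d}} \cdot \frac{e^{V_b}}{e^{V_b} + e^{V_c} + e^{V_d}} \cdot \frac{e^{V_c}}{e^{V_c} + e^{V_d}} \]

where \(V_j\) is the systematic utility of program j. Each fully ranked choice set thus contributes the statistical information of three separate choice observations.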
Next, Screen 22 tells respondents they will answer some questions about what they were
thinking when choosing the programs they prefer. Question Q17 asks respondents whether or not
they believe overfishing has caused the changes in coral reefs they were told about earlier.
Questions Q18 and Q19 then ask how serious the effects of overfishing would be without
additional no-fishing zones and how effective no-fishing zones would be if adopted.
With respect to ship accidents, Question Q20 asks respondents to evaluate how serious the
effects of ship accidents are on the MHI coral reef ecosystem. Questions Q21 and Q22 ask
respondents how effective they thought the Reef Repair Program would be in speeding up
recovery and if they thought recovery would take more than, less than, or about 10 years under
the Reef Repair Program.
Questions Q23-Q28 are used to evaluate the validity of the survey instrument. These questions
elicit respondents' attitudes about the proposed programs in the instrument and about various
groups and institutions in the United States, as well as their environmental attitudes.
Question Q29 asks if anyone in the household paid any federal income taxes in 2008. This will
be used to assess the use of federal income taxes as a payment vehicle.
Respondents who have audio are asked in Questions A1-A2a whether they thought the audio
presentation was helpful and whether additional audio instructions would have been helpful.
This is followed by Questions D1 and D2, which ask for information on the equipment
used by panel members to participate in the survey. This will allow assessment of differences in
survey responses by capabilities in receiving survey information.
Finally, the last screen reminds respondents that the survey is eliciting information useful to
NOAA and other agencies to estimate the value of coral reef ecosystems; it does not necessarily
represent actual government policy. These statements were developed in consultation with the
State of Hawaii and NOAA’s National Marine Sanctuary Program (NMSP). Peer reviewers were
adamant that these statements not be presented until respondents had completed and submitted
their survey responses.
Our plan is to administer the main survey using the ANES and MRI Internet Panels. When these
panels are recruited, a portion of each recruitment interview, independent of this specific Coral
Reef Valuation survey, will measure socio-demographics and other generic measures, including
contact information and questions to gauge Internet access.[2] We will be able to use this
socioeconomic data in our analysis. Additionally, the MRI Panel recruitment will have
interviewers conduct a brief face-to-face interview; invite respondents to accept a free computer
and other incentives; and ask them to join the panel. The ANES and MRI Panel members will
agree to complete one questionnaire
every month via the Internet.
Use of illustrations
The survey is designed to solicit preferences from the ANES and MRI Panels on three coral reef
conservation programs: (1) increasing the no-fishing zones around the MHI from 1% to 25% of
coral reef ecosystems, (2) annually repairing coral injuries caused by ship groundings, and (3)
increasing the no-fishing zones and repairing coral injuries around the MHI. Obtaining reliable
expressions of individuals’ preferences for these programs requires that the respondents have an
accurate understanding of the potential future status of coral reef ecosystems with and without
these three programs.
This data collection effort is complicated by several factors. First, it is expected that the majority
of the respondents have never visited a coral reef ecosystem and, thus, are likely to be unfamiliar
with this habitat type, beyond what has been learned from television, movies, books, and
magazines. Second, given the complexity of the ecosystem, it is unlikely that the respondents
could develop a relatively complete mental image of the habitat by solely relying on the textual
information contained in the instrument. Third, because of the lack of direct personal experience
with the habitat, it is expected that coral reef ecosystem conservation will be an issue of low
salience for many respondents. Fourth, the instrument contains significant technical detail on the
potential habitat changes that would result with and without the three alternative programs. If the
respondents are unlikely to be able to form a detailed mental image of a generic coral reef
ecosystem, they cannot be expected to picture the fine distinctions that would result from the
three conservation programs based solely on the textual descriptions.
To address these four challenges, the instrument includes a series of six professionally
developed, color illustrations (see Attachment 5). Appearing as pairs, these illustrations depict
how a typical reef location may appear with and without each management option. Including
illustrations in the instrument provides several benefits. First, for those respondents unfamiliar
with coral reefs, the illustrations provide a visual complement to the textual information. This
visual component will most likely strengthen respondents’ understanding of the habitat of
interest. Second, the graphics interspersed among the mostly text-based instrument are likely to
help respondents maintain interest and to focus on completing the questionnaire. This will act to
increase both the survey’s response rate and the accuracy of our data collection as the
respondents will be more engaged with the instrument. Finally, and most importantly,
illustrations allow the researchers to accurately display the potential different states of the
environment with and without the programs in a manner that can be comprehended by nearly all
respondents.
[2] Recruitment information for the ANES and MRI Panels is being collected independently of this request and is not part of the Coral Reef Valuation Study request to OMB.
The first pair of illustrations presents the current conditions at the MHI and conditions before
overfishing occurred there. The illustration on the left shows a representation of current
conditions at the MHI. The one on the right shows what the MHI looked like before overfishing
occurred. As compared with the status quo image (the image on the left), the illustration on the
right contains a greater number of fish, larger fish, the presence of fish schools in the
background, and less benthic macro algae.
The second pair of illustrations is designed to show the potential effects of increasing no-fishing
zones around the MHI from 1% to 25% of coral reef ecosystems in about 10 years. The first
illustration within this pair depicts the potential view of the reef if the status quo of 1% no-fishing zones is maintained, and the second illustration captures the potential view with no-fishing zones expanded to 25%. As compared with the status quo image, the second illustration
contains a greater number of fish, larger fish, the presence of fish schools in the background, and
less benthic macro algae.
The third pair of illustrations shows the potential effects of ship groundings on coral reef
ecosystems in the MHI. The illustration on the left depicts current conditions of the MHI coral
reefs. The illustration on the right shows how the current MHI scene would change immediately
following a ship grounding. The illustration contains a hull scar of crushed coral fragments, with
larger pieces of coral rubble forming berms to either side of the scar. The large coral heads have
been fractured, and there is a decrease in the number of fish present at the location. This
illustration is consistent with photographs of coral reef ship groundings from the Florida Keys.
Experimental design
This section describes the experimental design for the Coral Reef Valuation Study survey. The
developed design will be pretested using a subset of the overall design. Adjustments to the final
design for the main survey will be based upon the results of the pretest. The remainder of this
section describes the method and layout of the experimental design that will be used for the main
survey. We expect the attribute levels presented here to be final. If necessary, we will modify the
final cost estimates based on the results of the pretest.
There are three programs in the revised Coral Reef Valuation Study survey: (1) increasing the
no-fishing zones from 1% to 25% around the MHI (protecting reefs), (2) repairing reefs from
ship injuries so that injuries last 10 years rather than 50 years (repairing reefs), and (3)
implementing no-fishing zones and repairing reefs from ship injuries (both). Thus, there are two
attributes for the Coral survey: the percentage of Main Hawaiian Island reefs protected and the
years for reefs to be repaired from ship injuries. The individual programs, protecting reefs and
repairing reefs, have two levels apiece: the status quo or some positive action. As summarized in
Table A.1, the alternative levels for protecting and repairing reefs are 25% of reefs protected
versus 1% under the status quo, and injuries being repaired in 10 years rather than 50 years under
the status quo, respectively.
There are four possible combinations of attribute levels (referred to as alternatives) representing
the combinations of programs: the status quo, protecting reefs only, repairing reefs only, and
both protecting and repairing reefs. Because there are only four possible combinations, it is
possible to obtain a full ranking of a respondent’s preferences using only one choice set (with
four alternatives).
We have assigned each attribute a vector of bid amounts to represent the cost of implementing
the program to produce the desired attribute levels (Table A.1). The bid amounts were selected
as follows. We used the results from the Phase I pretest to create a distribution of willingness to
pay (WTP) estimates for the no-fishing zones and reef repair programs. We then simulated
probabilities of a respondent selecting each alternative using the parameter estimates from the
pretest and randomized error terms. We experimented with the bids to rebalance the probabilities
and to capture the overall range of WTP values.
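A minimal sketch of the simulation described above, assuming a conditional logit model (Python; the coefficient values are illustrative placeholders, not the Phase I pretest estimates):

    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder conditional-logit parameters (NOT the pretest estimates):
    # marginal utility of protecting reefs, of repairing reefs, and the
    # disutility per dollar of household cost.
    b_protect, b_repair, b_cost = 1.2, 0.8, 0.02

    def choice_probabilities(cost_protect, cost_repair, cost_both):
        """Logit probabilities for the four alternatives in one choice set."""
        v = np.array([
            0.0,                                        # status quo
            b_protect - b_cost * cost_protect,          # protecting reefs only
            b_repair - b_cost * cost_repair,            # repairing reefs only
            b_protect + b_repair - b_cost * cost_both,  # both programs
        ])
        ev = np.exp(v)
        return ev / ev.sum()

    # One candidate choice set (protect $30, repair $20, both $55), with
    # simulated choices for 1,000 hypothetical respondents.
    p = choice_probabilities(30, 20, 55)
    choices = rng.choice(4, size=1000, p=p)
    print(p, np.bincount(choices, minlength=4))

Repeating this calculation across candidate bid vectors shows whether any alternative would be chosen too rarely or too often, which is how the bids can be rebalanced.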
Table A.1. Program attributes and associated levels.

Attribute                                            Status quo level   Alternate level   Cost ($)
% reefs protected                                    1%                 25%               30, 60, 80, 110
Years for reefs to be repaired from ship injuries    50                 10                20, 40, 60, 85
The bid amounts represent the cost of implementing the individual programs. For the program
that involves both protecting and repairing, the bid amount is equal to the total cost of the
program (i.e., the sum of the individual project costs) plus a bundling adjustment. The bundling
adjustment is included to test if respondents are willing to pay a different amount for the
combination of programs (both protecting and repairing reefs) than for the individual programs
separately. This allows us to estimate an interaction term and to test whether this interaction term
is positive or negative. We have included two positive bundling adjustments, one negative
adjustment, and a zero adjustment to account for respondents who are willing to pay more or less
to have both programs implemented.
The bundling adjustment also accounts for the fact that the two programs could have economies
or diseconomies of scale. The bundling adjustments in this design are (-5), 0, 10, and 15.
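In random-utility notation (our notation, offered as one standard specification consistent with this description), the systematic utility of alternative j can be written

\[ V_j = \beta_P P_j + \beta_R R_j + \beta_{PR} P_j R_j - \beta_C C_j \]

where \(P_j\) and \(R_j\) are 0/1 indicators for whether alternative j protects and repairs reefs, respectively, and \(C_j\) is its cost. Because the bundling adjustments prevent the cost of the combined program from always equaling the sum of the individual program costs, the interaction coefficient \(\beta_{PR}\) is separately identifiable; a positive estimate would indicate the two programs are valued as complements, a negative one as substitutes.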
The design and bid amounts will be pretested to reinforce our understanding of how people are
trading off the individual programs. We will use the results of the pretest to re-estimate the
parameters and repeat the above process to refine the distribution of WTP estimates that reflect
these trade-offs.
There are 16 possible choice sets for the main survey that contain all the different combinations
of individual program costs. In each choice set, the cost of the combined program is the sum of
the individual program costs plus a bundling adjustment. Each individual program cost level
appears four times in the design matrix, and each time it appears it is paired with a different
bundling adjustment. Table A.2 presents the current experimental design matrix. As stated
above, the methodology, layout, and attribute levels for the main survey will match this design,
but the cost estimates may be revised based on the results of the pretest.
Table A.2. Experimental design matrix.

Each of the 16 choice sets contains the same four alternatives: the status quo (1% of reefs protected, 50 years to recovery, cost $0); protecting reefs only (25% protected, 50 years); repairing reefs only (1% protected, 10 years); and both (25% protected, 10 years). The choice sets differ only in the costs assigned to the non-status-quo alternatives:

Choice   Cost protecting   Cost repairing   Cost of both
set      only ($)          only ($)         ($)
 1       30                20               30 + 20 - (-5)
 2       30                40               30 + 40 - 0
 3       30                60               30 + 60 - 10
 4       30                85               30 + 85 - 15
 5       60                20               60 + 20 - 0
 6       60                40               60 + 40 - (-5)
 7       60                60               60 + 60 - 15
 8       60                85               60 + 85 - 10
 9       80                20               80 + 20 - 10
10       80                40               80 + 40 - 15
11       80                60               80 + 60 - (-5)
12       80                85               80 + 85 - 0
13       110               20               110 + 20 - 15
14       110               40               110 + 40 - 10
15       110               60               110 + 60 - 0
16       110               85               110 + 85 - (-5)
The experimental design will be tested using a pretest. The experimental design used in the
pretest will be a subset of the matrix presented in Table A.2. It will consist of eight choice sets
formed by dropping the (-5) and 15 bundling adjustments. We will use the results from the
pretest to modify the cost options to accurately depict respondents’ preferences for the main
survey.
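For concreteness, the full design matrix and the eight-choice-set pretest subset can be generated as follows (a sketch that simply transcribes Table A.2; subtracting the bundling adjustment follows the sign convention used in that table):

    from itertools import product

    PROTECT = [30, 60, 80, 110]  # cost of protecting reefs only ($)
    REPAIR = [20, 40, 60, 85]    # cost of repairing reefs only ($)

    # Bundling adjustment for each (protect, repair) cost pairing, transcribed
    # from Table A.2; each adjustment appears once in every row and column.
    ADJUST = [[-5,  0, 10, 15],
              [ 0, -5, 15, 10],
              [10, 15, -5,  0],
              [15, 10,  0, -5]]

    full_design, pretest_design = [], []
    for i, j in product(range(4), range(4)):
        p, r, a = PROTECT[i], REPAIR[j], ADJUST[i][j]
        choice_set = {"protect": p, "repair": r, "both": p + r - a}
        full_design.append(choice_set)
        if a not in (-5, 15):  # the pretest drops the -5 and 15 adjustments
            pretest_design.append(choice_set)

    assert len(full_design) == 16 and len(pretest_design) == 8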
Use of stated choice questions
Stated choice methods have been identified as a useful tool to better understand individuals’
preferences and values for environmental amenities that are not traded in markets. While there is
some use of coral reef ecosystems around the MHI, protection of coral reefs there has a large
public good component. No markets are available to study the value of protecting and restoring
these coral reef ecosystems. Stated choice methods also allow for the evaluation of a full range
of management alternatives, including alternatives currently in force and novel combinations of
management alternatives like those being considered for implementation in Hawaii.
Stated choice methods are well established in the literature on environmental economics
(Kanninen, 2007). This approach evolved from conjoint analysis, a method used extensively in
marketing and transportation research (Louviere et al., 2000).[3] Conjoint analysis requires
respondents to rank or rate multiple alternatives where each alternative is characterized by
multiple characteristics (e.g., Johnson et al., 1995; Roe et al., 1996; Holmes and Adamowicz,
2003). Choice questions require respondents to choose the most preferred alternative (a partial
ranking) from multiple alternative goods (i.e., a choice set), where the alternatives within a
choice set are differentiated by their characteristics.
There are many desirable aspects of stated choice questions, not the least of which is the nature
of the choice being made. Choosing the most preferred alternative from some set of alternatives
is a common experience. Morikawa et al. (1990) note that responses to choice questions often
contain useful information on tradeoffs among characteristics. Quoting from Mathews et al.
(1997), stated choice “models provide valuable information for restoration decisions by
identifying the characteristics that matter to anglers and the relative importance of different
characteristics that might be included in a fishing restoration program.” Johnson et al. (1995)
note “The process of evaluating a series of pairwise comparisons of attribute profiles encourages
respondents to explore their preferences for various attribute combinations.” Choice questions
encourage respondents to concentrate on the tradeoffs between characteristics, rather than to take
a position for or against an initiative or policy. Adamowicz et al. (1998a) note that the repeated
nature of choice questions makes it difficult to behave strategically.

[3] Cattin and Wittink (1982) and Wittink and Cattin (1989) survey the commercial use of conjoint analysis, which is widespread. For survey articles and reviews of conjoint analysis, see Louviere (1988, 1992), Green and Srinivasan (1990), and Batsell and Louviere (1991). Transportation planners use choice questions to determine how commuters would respond to a new mode of transportation or a change in an existing mode. Hensher (1994) gives an overview of choice questions applied in transportation.
As mentioned previously, choice questions allow for the construction of goods characterized by
levels that currently do not exist. This feature is particularly useful in marketing studies whose
purpose is to estimate preferences for proposed goods, where various characteristics can be
manipulated in arriving at final product designs.[4] For example, Beggs et al. (1981) assess the
potential demand for electric cars. Similarly, researchers estimating the value of environmental
goods are often valuing a good or condition that does not currently exist, e.g., an MPA around
coral reef systems.
Choice questions, rankings, and ratings are increasingly used to estimate the value of
environmental goods. For example, Magat et al. (1988) and Viscusi et al. (1991) estimate the
value of reducing health risks; Adamowicz et al. (1994, 1998b, 2004), Breffle et al. (2005), and
Morey et al. (1999a) estimate recreational site choice models for moose hunting, fishing, and
mountain biking; Breffle and Rowe (2002) estimate the value of broad ecosystem attributes
(e.g., water quality, wetlands habitat); Adamowicz et al. (1998a) estimate the value of enhancing
the population of a threatened species; Layton and Brown (1998) estimate the value of mitigating
forest loss resulting from global climate change; and Morey et al. (1999b) estimate WTP for
monument preservation in Washington, DC. In each of these studies, a price (e.g., tax or a
measure of travel costs) is included as one of the characteristics of each alternative, so that
preferences for the other characteristics can be measured in terms of dollars. Other examples of
choice questions to value environmental commodities include Swait et al. (1998), who compare
prevention versus compensation programs for oil spills, and Mathews et al. (1997) and Ruby
et al. (1998), who ask anglers to choose between two saltwater fishing sites as a function of their
characteristics.
Alternatively, a number of environmental studies have used ratings, in which survey respondents
rate the degree to which they prefer one alternative to another. For example, Opaluch et al.
(1993) and Kline and Wichelns (1996) develop a utility index for the characteristics associated
with potential noxious facility sites and farmland preservation, respectively. Johnson and
Desvousges (1997) estimate WTP for various electricity generation scenarios using a rating scale
in which respondents indicate their strength of preference for one of two alternatives within each
choice set. Other environmental examples include Rae (1983), Lareau and Rae (1998), Krupnick
and Cropper (1992), Gan and Luzar (1993), and Mackenzie (1993). Adamowicz et al. (1998b)
provide an overview of choice and ranking experiments applied to environmental valuation, and
argue that choice questions better predict actual choices than do rating questions because choice
questions mimic the real choices individuals are continuously required to make, whereas
individuals rank and rate much less often.5

5. See, for example, Louviere and Woodward (1983), Louviere (1988), and Elrod et al. (1992).
Choice and rating questions characterize the alternatives in terms of a small number of
characteristics. For example, Opaluch et al. (1993) characterize noxious facilities in terms of
seven characteristics; Adamowicz et al. (1998b) use six characteristics to describe recreational
hunting sites; Johnson and Desvousges (1997) use nine characteristics to describe electricity
generation scenarios; Mathews et al. (1997) use seven characteristics to describe fishing sites;
Morey et al. (1999a) use six characteristics to describe mountain bike sites; and Morey et al.
(1999b) use two characteristics to characterize monument preservation programs.
Focus groups and cognitive interviews conducted during Phase I of this project showed that a
solid foundation exists for the application of stated choice methods to the valuation of Hawaiian
coral reef ecosystems. The study subjects demonstrated a rudimentary understanding of coral
reefs and ecosystems based on schooling, nature programs, reading, and in some cases personal
experience. Although many people had not personally visited coral reefs and did not plan to use
them directly, they could understand both how the reefs are useful to others and the reefs'
ecological functions. We were able to build on this understanding with enough specific
information about the situation in the Hawaiian Islands to allow most subjects to feel
comfortable expressing their preferences among alternatives. We also found that most subjects
had little or no difficulty with choice questions involving three alternatives and could identify
their most and least preferred alternatives (a least-preferred question was asked in the first
pretest; the proposed pretest and main study would use four alternatives and would not ask for
the least preferred alternative). This allowed us to go beyond the conventional approach of
asking about two alternatives and to gain some of the richness of ranking questions without
forcing subjects to produce potentially artificial rankings of larger numbers of alternatives. In
particular, three-alternative choice questions allow us to include the no-action alternative in
all the choice sets. This avoids the problems that arise when respondents are forced to choose
between two alternatives, neither of which they find particularly desirable compared to doing
and spending nothing more.
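
As a purely hypothetical illustration of this structure, the sketch below lays out a
four-alternative choice set in which the no-action alternative always appears. The attribute
names, levels, and costs are invented for exposition and are not taken from the survey instrument.

    # Hypothetical four-alternative choice set; attribute names, levels, and
    # costs are invented for illustration and are not from the actual instrument.
    NO_ACTION = {"label": "No action", "reef_protected_pct": 0,
                 "fish_populations": "declining", "annual_cost_usd": 0}

    programs = [
        {"label": "Program A", "reef_protected_pct": 10,
         "fish_populations": "stable", "annual_cost_usd": 25},
        {"label": "Program B", "reef_protected_pct": 25,
         "fish_populations": "recovering", "annual_cost_usd": 60},
        {"label": "Program C", "reef_protected_pct": 40,
         "fish_populations": "recovering", "annual_cost_usd": 120},
    ]

    # The no-action alternative is included in every choice set, so a
    # respondent is never forced to pick between two undesirable paid programs.
    choice_set = [NO_ACTION] + programs
    for alternative in choice_set:
        print(alternative)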
Survey mode
Pretest survey
For the pretest survey, we plan to use KN’s established Web-enabled panel. See the discussion
below under "Main survey" for the justification for using a Web-enabled survey instead of a
telephone or telephone-mail survey.
Main survey
We propose to concurrently use two Web-enabled panels for the main survey. Independent of
this data collection effort, KN and Abt SRBI (with support and oversight from Stanford
University) are developing both panels. The first is the ANES Panel, built and administered by
KN.6 Recruitment to this panel will be based on a list-assisted, random-digit-dial (RDD) sample
drawn from all 10-digit telephone numbers. The second is Stanford's MRI Panel, built and
administered by Abt SRBI. MRI Panel members will be selected based on a multistage probability
sample of residential mailing addresses. Abt SRBI will roster each household and then randomly
select one of the eligible members. The sample will be limited to households; group quarters
(e.g., college dormitories and nursing homes) will be excluded from the eligible target
population.

6. The ANES Web Panel is separate from KN's existing Web-enabled panel.
We will use this standing-panel, Web-based approach to overcome a set of potential problems
inherent in the research. As revealed in the focus groups and cognitive interviews conducted
during Phase I, most people have some familiarity with coral reefs through nature programs and
other sources, but it will be necessary to convey more information to subjects than they could
easily comprehend if it were built into a simple telephone survey. Furthermore, while some
people are interested in and concerned about coral reefs, many others are not. Hence, we
rejected the option of recruiting by telephone using RDD and following up with a mail survey,
since the low salience of the topic to many respondents could lead to a low response rate. There
is no way to separate those nonrespondents who simply lack sufficient interest in coral reefs
(and hence have near-zero values) from those who did not respond for other reasons, such as an
inability to deal with large amounts of written information. We have concluded that RDD- and
in-person-recruited Web-enabled surveys will be superior to a telephone or telephone-mail survey
for the following reasons:
- We can get higher response rates using the ANES and MRI Web-enabled panels than we could
  with a telephone or telephone-mail survey, given the low salience of the topic to many people.

- We can use pictures, graphical materials, voice-over, and other tools to communicate
  information more effectively to respondents, ease the respondents' burden, and maintain
  interest among those for whom coral reef issues are of low salience.

- We can easily and seamlessly make later questions conditional on responses to earlier ones.
  Skip patterns are used to address this problem in mail surveys, and they often cause
  difficulties as subjects get lost and skip questions or try to answer questions that do not
  apply to them. Web-enabled surveys are programmed so that skip patterns are automatic.

- The effectiveness of the stated choice questions can be enhanced by making the attributes of
  the alternatives in later questions conditional on the choices made in earlier ones (see the
  sketch following this list). This is not possible in mail surveys.

- We can avoid potential problems that can arise when respondents do not read the material in
  a mail instrument at all or read it in a different order than the survey designers intended.
  For example, one danger in this type of survey is that subjects in a mail survey may try to
  complete stated choice questions before digesting the information needed to answer them.
- Experimental treatments can be easily and independently randomized among respondents.

- We can track all stages of the recruitment process to provide a solid basis for evaluating
  the representativeness of each of the samples in a more detailed way than can be
  accomplished with most other survey implementation methods.

In addition, sampling is cost-effective for reaching both the main sample and, if desired, a
subsample of coral reef users.
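
The conditioning of later choice sets on earlier answers, referenced in the list above, can be
sketched as a simple pivot rule. The rule, variable names, and cost levels below are hypothetical
illustrations of the mechanism, not the logic actually programmed into the instrument.

    # Hypothetical pivot rule: center the next choice set's cost levels on the
    # cost of the alternative chosen previously, sharpening the price tradeoff.
    # This illustrates conditioning in general, not the survey's actual rule.
    def next_choice_set(previous_set, chosen_index):
        # Fall back to a small positive base if the no-cost option was chosen.
        base = previous_set[chosen_index]["annual_cost_usd"] or 25
        multipliers = (0.5, 1.0, 1.5)
        new_set = [dict(previous_set[0])]  # no-action alternative keeps zero cost
        for alternative, m in zip(previous_set[1:], multipliers):
            pivoted = dict(alternative)
            pivoted["annual_cost_usd"] = round(base * m)
            new_set.append(pivoted)
        return new_set

    previous = [
        {"label": "No action", "annual_cost_usd": 0},
        {"label": "Program A", "annual_cost_usd": 25},
        {"label": "Program B", "annual_cost_usd": 60},
        {"label": "Program C", "annual_cost_usd": 120},
    ]
    print(next_choice_set(previous, chosen_index=2))  # respondent chose Program B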
Frequency of the information collection
The cognitive one-on-one interviews, the second pretest, and the main survey are a one-time application.
How collection complies with NOAA information quality guidelines
Utility
The overall study goals were refined in Phase I of the project through interviews with key
stakeholder groups, including federal and state resource managers and members of the
U.S. CRTF. These initial interviews allowed us to identify key information needs. At critical
points throughout the study, we updated the key stakeholders on its status. This
ensures that all information developed from this project will be transparent to all members of the
public.
The first pretest has allowed NOAA to further refine the survey instrument with respect to
information presentation, reliability, internal consistency, response variability, and other
properties of a newly developed survey. It has helped ensure that the information obtained from
the survey is of the
highest quality. Due to recent changes in the survey instrument, we are proposing to implement
another pretest (following some cognitive one-on-one interviews) to test these changes and to
ensure proper programming of the instrument.
Objectivity
In developing the survey instrument, we are following state-of-the-art practices. Focus groups,
cognitive interviews, scientific fact peer review, and peer review of survey sample design,
question wording, the balance of information provided (to avoid acquiescence bias, i.e., leading people to adopt
a certain position), and nonmarket economic valuation methods have been conducted while
designing the current survey instrument. Internal and external peer reviews will be conducted on
all project products (e.g., survey instruments, sample designs, analyses, and reports). Peer review
will ensure that the information collected is accurate, reliable, and unbiased; and that the
information reported to the public is accurate, clear, complete, and unbiased. In our answer to the
section on “By Whom,” we detail the internal and external peer reviewers.
Integrity
A separate file will be provided to all panel members in the survey, which will contain the
following statement:
Your participation in this survey is voluntary. All responses will be protected and any
material identifying you will not be provided to anyone.
KN will provide the ANES Panel members with its Panel Member Bill of Rights, included in this
submission (see Attachment 3). For a full discussion of KN and Abt SRBI’s procedures for
protection of information, see Question 10 of this supporting statement.
It is anticipated that the information collected will be disseminated to the public. As explained in
the preceding paragraphs, the information has utility. NOAA’s NOS will retain control over the
information and safeguard it from improper access, modification, and destruction, consistent
with NOAA standards of protection of information. The information collection is designed to
yield data that meet all applicable information quality guidelines. Prior to dissemination, the
information will be subjected to quality control measures (see Attachment 4 for KN and Abt
SRBI’s Quality Assurance Procedures) and a predissemination review pursuant to Section 515 of
Public Law 106-554.
3. Describe whether, and to what extent, the collection of information involves the use of
automated, electronic, mechanical, or other technological techniques or other forms of
information technology.
Automated, electronic data collection
Respondents will participate in the survey using either a home-based personal computer
connected to the Internet, a personal laptop computer with Internet service, or a Web-capable
appliance such as the MSN TV 2 with Internet service. Because we are one part of a larger
scientific study, it will be possible to give a Web-capable appliance and/or Internet access to
panelists who do not already have them. Non-Internet households participating in the ANES
Panel will receive an MSN TV 2 Internet and Media Player and Internet service at no expense. For
the MRI Panel, non-Internet households will receive laptops with broadband Internet access at no
expense.
All Web-enabled panel surveys are self-administered, which allows respondents to complete the
surveys at their convenience and own pace, in the comfort and privacy of their homes. The
electronic survey system supports the inclusion of video, audio, and 3-D graphics in the
questionnaire. Respondents can break off and return to complete an interview during a second or
later session. The electronic data collection tracks how long respondents spend on each screen.
The data capture survey system, owned by KN, was designed to meet the specific needs of
Web-based surveys. The system supports all types of questions commonly used in complex,
computer-based interviewing systems. It uses advanced scripting techniques for customization of
individual questions to meet the needs of researchers proposing innovative designs. The data
capture platform supports the complexity and types of questions proposed in our study, including
multimedia graphics and voice-over presentation.
The system also supports the importation of auxiliary data, such as demographic information
collected as part of the screening. These data can be used to inform question logic, question
wording, etc.
See the section “Survey Mode” in answer to Question 2 for justification of using the KN Internet
technology for this application.
4. Describe efforts to identify duplication.
There are no published studies in the survey research literature that ask respondents questions
about the preservation and repair of nonmarket environmental goods, such as coral reef
ecosystems, and compare the responses across two independently recruited Internet panels
(e.g., the ANES and MRI Panels) with widely different response rates.
5. If the collection of information involves small businesses or other small entities, describe
the methods used to minimize burden.
This collection does not include collection of information involving small businesses or other
small entities.
6. Describe the consequences to the Federal program or policy activities if the collection is
not conducted or is conducted less frequently.
If this collection is not conducted, NOAA will lack the tools it will need in the future to conduct
surveys for determining public preferences for protection and repair of marine environmental
resources. This is a one-time collection for the cognitive one-on-one interviews, the second
pretest, and the main survey.
7. Explain any special circumstances that require the collection to be conducted in a
manner inconsistent with OMB guidelines.
Data collected from the ANES Panel are expected to achieve a net survey response rate of 20%.
This is based on an expected 31% panel recruitment response rate [American Association for
Public Opinion Research (AAPOR) Response Rate No. 3], a 75% connection rate (agreeing to join
the panel and completing the first online demographic survey), and an 85% survey participation
rate. The ANES Web Panel has been created, and the first three waves (i.e., the first three
months) of data collection are complete.7 The first wave of online data collection experienced a
survey participation rate in excess of 90%. The low overall response rate is due to the
multistage construction of the KN Panel.

For the recruited MRI Panel, we anticipate an overall response rate of about 63%. This is based
on a 90% participation rate for each monthly wave.8

7. The panels are being recruited to administer a series of surveys over multiple months (waves).
8. This is based on the participation rate observed for the ANES Panel; we expect the
participation rate for the MRI Panel to be similar.
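
As a check on the figures above, the net rates are simple products of the stage-level rates. The
sketch below verifies the ANES figure and backs out the implied MRI recruitment rate, which is
not stated explicitly and is therefore our inference.

    # ANES Panel: net response rate = recruitment x connection x participation.
    anes_recruitment = 0.31    # AAPOR Response Rate No. 3 panel recruitment
    anes_connection = 0.75     # joined the panel and completed the profile survey
    anes_participation = 0.85  # per-survey participation
    net = anes_recruitment * anes_connection * anes_participation
    print(f"ANES net response rate: {net:.1%}")  # 19.8%, i.e., about 20%

    # MRI Panel: a 63% overall rate with 90% wave participation implies a
    # recruitment rate of about 70% (inferred; not stated in the text).
    print(f"Implied MRI recruitment rate: {0.63 / 0.90:.0%}")  # 70%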
These estimates are based on the recruitment rates reported for other KN RDD and Abt SRBI
in-person surveys, on participation rates reported in the industry, and on the effort designed
into the ANES/SRBI study to ensure high completion rates.
See the answer to Question 9 of this Supporting Statement on the use of incentives as a way of
increasing response rates and Part B, Question 2, which addresses the representativeness of the
Internet RDD and in-person panels.
8. Provide information on the PRA Federal Register Notice that solicited public comments
on the information collection prior to this submission. Summarize the public comments
received in response to that notice and describe the actions taken by the agency in response
to those comments. Describe the efforts to consult with persons outside the agency to obtain
their views on the availability of data, frequency of collection, the clarity of instructions
and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be
recorded, disclosed, or reported.
A Federal Register Notice, published on September 12, 2006 (71 FR 53667), solicited public
comment (see Attachment 7).
One set of comments was received from the Western Pacific Fishery Management Council
(WPFMC); however, these comments were based on a version of the survey instrument that is
different from the one included in this package; they were included in the original supporting
statement, but we prefer not to include them here again to avoid confusion. We also consulted
with the State of Hawaii on the policy/management options we evaluated in the survey.
9. Explain any decisions to provide payments or gifts to respondents, other than
remuneration of contractors or grantees.
Cognitive one-on-one interviews
For the cognitive one-on-one interviews, we plan to give participants $50 to compensate and
thank them for giving up 1½ hours to participate in our interview.
Pretest survey
Pretest respondents will receive a $5 check for their participation. See the section below called
“Main survey” for more specific information on why we typically give incentives to respondents.
Main survey
Two types of respondent incentives are provided: nonsurvey-specific and survey-specific
incentives.
Survey-specific incentives
KN and Abt SRBI will provide survey-specific incentives to respondents as a result of one of
two conditions: (1) the survey is expected to require more than 20 minutes of time to complete,
or (2) there is an unusual request being made of the respondent, such as specimen collection, the
viewing of a specific television program, or completion of a daily diary. In these kinds of
circumstances, panelists are being asked to participate in ways that are more burdensome than
initially described to respondents during the panel recruitment stage.
For the main survey, an incentive will be provided because the survey is expected to require 20
or more minutes to complete. Extra encouragement will be required because the survey will
require participants to read and digest more information than is the case in other types of surveys
and because the topic of the survey will be of limited salience for significant numbers of people.
Inclusion of an incentive acts as a sign of goodwill on the part of study sponsors and encourages
reciprocity by the respondent. Singer (2002) provides a review of the use of incentives in
surveys. In summary, her findings show that giving respondents a small monetary incentive
increases response rates. KN has analyzed the predictors of survey completion rates of studies
conducted using its Web-enabled panel. A multivariate analysis based on approximately 500 KN
surveys attempted to predict the effect of respondent incentives on survey completion rates while
controlling for length of field period, sample composition, use of video in the instrument, and
other factors. The effect of respondents' incentives is significant (p < 0.01) for both $5 and
$10 cash-equivalent incentives. Use of a $5 incentive increased response by 4 percentage points
and a $10 incentive increased response by 6 percentage points. Internal KN research has
demonstrated that incentives increase the survey completion rate by approximately 5 percentage
points. The increase is larger for young adults and Hispanics.
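
KN's multivariate analysis is described only in outline above. The sketch below shows the general
shape of such a regression using hypothetical variable and file names; the actual KN data and
specification are not public.

    # Illustrative regression of survey completion rates on incentive levels,
    # controlling for other design factors. Variable and file names are
    # hypothetical; this is not KN's actual analysis.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("kn_survey_outcomes.csv")  # one row per survey (~500 rows)
    model = smf.ols(
        "completion_rate ~ incentive_5 + incentive_10"
        " + field_period_days + uses_video + pct_young_adult + pct_hispanic",
        data=df,
    ).fit()
    print(model.summary())
    # Coefficients of about +0.04 on incentive_5 and +0.06 on incentive_10,
    # each significant at p < 0.01, would reproduce the findings cited above.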
ANES panel members who participate in the survey will be sent a check for $10 by U.S. mail for
their participation.
These measures are expected to contribute significantly to a survey completion rate of 90% for
both the ANES and MRI Panels.
Nonsurvey-specific incentives
Nonsurvey-specific incentives are used to maintain a high degree of panel loyalty and to prevent
attrition from the panel. Both KN and Abt SRBI will provide panel members with an Internet
connection and laptops or Web-capable devices if they do not already have them. For the
households provided with Internet appliances and an Internet connection, their incentive is the
hardware and Internet service. ANES Panel members will receive an MSN TV 2 Internet and
Media Player and Internet service. All MRI Panel members will be offered a laptop computer
and broadband Internet access.
10. Describe any assurance of confidentiality provided to respondents and the basis for
assurance in statute, regulation, or agency policy.
KN will conduct the survey for NOAA under subcontract to Stratus Consulting. KN will
administer the survey to both the ANES and MRI Panels, even though the two panels will be
recruited independently. Neither Stratus Consulting, NOAA, nor anyone else will receive a name,
address, telephone number, or email address that could be used to identify a survey participant.
Stratus Consulting and NOAA will also not release survey data that might be used to identify an
individual who participated in the survey, applying the "rule of 10"9 used by the U.S. Census
Bureau. KN and Abt SRBI procedures to protect information follow.

9. As developed by the U.S. Census Disclosure Review Board, in some circumstances tabulated
data on population households must be rounded to 10s. The rounding scheme is: 0 remains 0;
1-4 rounds to 0; 5-14 rounds to 10; 15-24 rounds to 20, and so on.
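
The scheme in footnote 9 is equivalent to conventional rounding to the nearest 10 with halves
rounding up. A minimal sketch (our illustration, not Census or KN code):

    def round_to_tens(count: int) -> int:
        """Apply the Census 'rule of 10': 0 stays 0; 1-4 round to 0;
        5-14 round to 10; 15-24 round to 20; and so on."""
        return (count + 5) // 10 * 10

    # Quick check against the scheme quoted in footnote 9.
    assert [round_to_tens(n) for n in (0, 1, 4, 5, 14, 15, 24, 25)] == [
        0, 0, 0, 10, 10, 20, 20, 30]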
Survey responses are protected, with identifying information never revealed without respondent
approval. When surveys are assigned to panel members, they receive notice in their
password-protected email account that the survey is available for completion. Surveys are
self-administered and accessible any time of day for a designated period.
KN procedures
All ANES Panelists, when joining the panel, are given a copy of the Privacy and Terms of Use
Policy. In the privacy terms, a section called the "Panel Member Bill of Rights" summarizes the
information protections for panelists and explains that respondents can decide whether to
participate in the panel or to answer any survey questions. The Bill of Rights is also available
electronically at all times to panelists via the panel member Website.
Below is a summary of the measures that will be taken to meet the needs for privacy and
confidentiality from the point of data access and information technology (IT).
First, all employees of KN are required to sign an agreement requiring them to protect all
personally identifiable information regarding panel members. KN warrants that all employees are
bound to protect all personal information provided by respondents, and very few employees
actually have access to any personal data. The only employees who have access to this
information are those with a direct need to know. Therefore, the only persons with access are the
following:
- Database and IT administrators with access to computer servers for maintaining the computer
  systems at KN.

- Staff members in the Panel Relations department who have direct contact with panel members
  as part of the inbound and outbound call center operations. These staff members are
  responsible for troubleshooting any problems panelists might be having with their equipment
  or software related to survey administration, incentive fulfillment, and panel management.

- Staff members of the Statistics department, who have access to personally identifying
  information to draw samples for the various surveys conducted at KN.
All personally identifying records are kept secured in a separate office in the IT section of the
main offices in Menlo Park, California, and all data transfers from MSN TV 2 and WebTV units
and personal computers (both used for survey administration) to the main servers pass through a
firewall. KN never provides any respondent personal identifiers to any client or agency without
the explicit and informed consent provided by the sampled panel members. Unless explicitly
permitted as documented in a consent form, no personally identifying information will be
provided to any parties outside KN in combination with the survey response data.
All electronic survey data records are stored in a secured database that does not contain
personally identifying information. The staff members in the Panel Relations and Statistics
departments, who have access to the personally identifying information, do not have access to
the survey response data. Meanwhile, the staff members with access to the survey response data,
with the exception of the aforementioned database and IT administrators who must have access
to maintain the computer systems, do not have access to the personally identifying information.
The secured database contains field-specific permissions that restrict access to the data by type of
user, as described above, preventing unauthorized access.
Only an incremented ID number identifies the survey response data. The personally identifying
information is stored in a separate database that is accessible only to persons with a need to
know, as described above.
The survey data extraction system exports survey data identified only by the panel member ID
number. This ensures individual panel member anonymity. The data analysts with access to the
survey data extraction system, as they do not have access to personally identifying information,
cannot join survey data to personally identifying data. Panel Relations and Statistics staff do not
have access to the survey data extraction system, and therefore cannot join survey data to
personally identifying data.
As part of its work with Research Triangle Institute International on a survey conducted in
support of Food and Drug Administration (FDA) applications, KN implemented Good Clinical
Practice guidelines to ensure compliance with FDA requirements for systems documentation and
privacy of stored survey data. Consequently, a system of standard operating procedures is in
place for documenting all processes relating to maintaining confidentiality and privacy of the
identities of panel members.
KN retains the survey response data in its secure database after the completion of a project.
These data are retained for purposes of operational research, such as studies of response rates and
for the security of customers who might request at a later time additional analyses, statistical
adjustments, or statistical surveys that would require re-surveying research subjects as part of
validation or longitudinal surveys. The survey data for all the projects conducted on the ANES Web
Panel are also stored in a data vault maintained by Stanford University. The only person with access to
these data at Stanford University is the Director of Operations of the ANES.
A file will be provided to all panel members in the survey, which will contain the following
statement:
Your participation in this survey is voluntary. All responses are protected and any
material identifying you will not be provided to anyone outside of Knowledge
Networks. Also see the Knowledge Networks Bill of Rights.
Abt SRBI procedures
For this study, Abt SRBI will recruit the MRI panel and KN will administer the survey. As a
member of the Council of American Survey Research Organizations (CASRO), Abt SRBI fully
abides by CASRO's regulations on preserving respondent information and will have the
following measures in place to do so.
Abt SRBI follows these routine practices:

- Educating the research staff about the importance of confidentiality
- Substituting codes for personal identifiers
- Removing personal identifiers from data files
- Limiting access to identifiable data
- Storing identifiable data under secure conditions
- Maintaining personal identifying information only as long as required and only under
  conditions specified in the study protocol
- Properly disposing of records with identifying information as specified in the study protocol.
Below is a summary of the measures that will be taken to meet the needs for information
protection from the point of data access and IT.
The AT&T network operations center (NOC) in which Abt SRBI co-locates its servers is a
hardened facility with many levels of security for the protection of hardware and data. The
facility is monitored 24 hours a day, 7 days a week, by on-site professional security guards and
over continuous closed-circuit video surveillance from a command center, via both stationary and
360° cameras located outside and inside the facility.
Access is controlled by electronic key cards and a "man trap" with biometric palm scanners and
individual PINs. The cages of server racks and cabinets are also locked to prevent unauthorized
access. Security-breach alarms exist at each security point to prevent bypassing of the access
controls. Visitors must be individually pre-authorized before they are allowed to enter the
building.
Abt SRBI further controls access to its equipment through strong password requirements and by
locking the desktop of each server automatically when it is idle or when a system administrator
logs in remotely. The servers are protected on the network by SonicWALL Pro 3060 firewalls, which
block access from the Internet except to pre-authorized Internet Protocol addresses and log all
access and intrusion attempts. The site is also secured from disaster by redundant bandwidth,
power, fire and heat detection, fire suppression, and ventilation systems.
11. Provide additional justification for any questions of a sensitive nature, such as sexual
behavior and attitudes, religious beliefs, and other matters that are commonly considered
private.
No questions of a sensitive nature are asked in this survey.
12. Provide an estimate in hours of the burden of the collection of information.
Estimated number of respondents:
A. Number of respondents for the cognitive one-on-one interviews: 32
B. Number of respondents for the pretest: 250
C. Number of respondents for the main study: 2,691
D. Total number of respondents: 2,973

Estimated time per response:
A. Cognitive one-on-one interviews: 90 minutes
B. Pretest survey: 30 minutes
C. Main survey: 30 minutes

Estimated total annual burden hours:
A. Cognitive one-on-one interview burden hours: 48
B. Pretest burden hours: 125
C. Main survey burden hours: 1,345.5
D. Total burden hours: 1,518.5

Estimated total annual cost to the public for the pretest and main survey: 1,518.5 burden hours
(a one-time application; no additional costs are expected per respondent for this study).
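
These burden-hour figures follow arithmetically from the respondent counts and per-response
times; a quick check:

    # Burden hours = respondents x (minutes per response / 60).
    components = {
        "cognitive interviews": (32, 90),
        "pretest": (250, 30),
        "main survey": (2691, 30),
    }
    hours = {name: n * minutes / 60 for name, (n, minutes) in components.items()}
    print(hours)                # {'cognitive interviews': 48.0, 'pretest': 125.0, 'main survey': 1345.5}
    print(sum(hours.values()))  # 1518.5 total burden hours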
13. Provide an estimate of the total annual cost burden to the respondents or recordkeepers resulting from the collection (excluding the value of the burden hours in #12
above).
No additional cost burden will be imposed on respondents aside from the burden hours indicated
above.
14. Provide estimates of annualized cost to the Federal government.
Table A.3 shows the annualized cost to the federal government during each phase of the project.
Table A.3. Annualized cost to the federal government

Project phase                        FY 03  FY 04  FY 05  FY 06  FY 07  FY 08  FY 09  Total
1. Questionnaire and sample design
   a. Stratus Consulting contract    $80k   $80k   $30k   $20k   $20k                 $230k
   b. Steve Thur contract            $5k    $5k    $5k                                $15k
   c. NOAA personnel travel          $5k    $5k    $5k    $5k                         $20k
   d. Peer review                    $5k    $5k    $5k    $5k                         $20k
2. Main survey implementation
   a. Stratus Consulting contract                                       $270k         $270k
   b. Peer review                                                       $5k           $5k
3. Analysis and reporting
   a. Stratus Consulting contract                                              $250k  $250k
   b. Peer review                                                              $5k    $5k
Total project                        $95k   $95k   $45k   $30k   $20k   $275k  $255k  $815k
The entire project is spread over seven fiscal years. The contract with Stratus Consulting Inc.
includes all subcontracts to support questionnaire and sample design, main survey
implementation, analysis and reporting, and some peer review. NOAA is independently paying
for other peer review. Steven Thur was a contract employee until June 2005, when he became a
full-time NOAA employee. As a contract employee, a portion of his time was allocated to the
project. Dollars are reported in thousands ($80k means $80,000). Dollars for the Stratus
Consulting contract are recorded in the year invoices were paid, not according to the date and
amount of the signed contract.
The $530k cost of the final survey implementation and analysis and reporting will be incurred in
FY2008 and FY2009.
15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14
of the OMB 83-I.
This is a new survey.
16. For collections whose results will be published, outline the plans for tabulation and
publication.
The results of the main survey will be tabulated using simple summary statistical analyses of the
data (means, medians, standard deviations, maximums, and minimums). The main survey report
will include details on the methods of analysis, plans for tabulation, and publication of project
results. All project reports (pretest and main survey) will be posted online on the NOAA Website
(http://sanctuaries.noaa.gov/Socioeconomics) in PDF format. All data files will be documented and
distributed via CD-ROM and/or online on the NOAA Website.
The results of the pretest have not yet been made public, except for their inclusion in the
supplemental statement supporting this OMB approval request for the main survey implementation.
Pretest results documenting how estimates of the total economic value were derived are included
in this submission.
17. If seeking approval to not display the expiration date for OMB approval of the
information collection, explain the reasons why display would be inappropriate.
NA.
18. Explain each exception to the certification statement identified in Item 19 of the
OMB 83-I.
NA.